[Binary content removed: POSIX tar archive of a Zuul CI output directory. Recoverable entries from the tar headers:]

- var/home/core/zuul-output/ (directory, mode 0755, owner core:core)
- var/home/core/zuul-output/logs/ (directory, mode 0755, owner core:core)
- var/home/core/zuul-output/logs/kubelet.log.gz (regular file, mode 0644, owner core:core; gzip-compressed kubelet log, original name kubelet.log)

[The remainder of the archive is gzip-compressed binary data and is not recoverable as text.]
8kٍu[}`A}VӶhnw,S6[Zڧg 3aP9  Cx:xL7RO:4[Pj& ̀ b !_0A^TeŚp)f{NF o+[ Y=/~ƄޖP[ 7Z.`BȺV"`;Z.̴~قe?֋fY^P s0çW-n7i)FN~"c9O O *uw +Bh{c I_,.ۙ NQK\^AUcq8;e5x! ƩV5-O_?~5q ON}0f-gEnlt9/r^HyUnzNwr+PăSIqʰL6#]q<+kA+m;:JrR}Nߺ'\c pL r4(+%j5}Ԉw@Z*/ w7paNF;~ [u'7d a8U8jH߂b ŷ_؎4N#&y5{:}|+Ͻl_LZyU%_1fu{J7Ց[̩M%}.p8i|N*v ƭR->D?[n&/xKہYѕ&3. 񂲹\n5I>g_ҥm&wRlT'vëyX"9gfo^bZwe_߮KW+ҕ}ʾte_/`p%4S.CwKe2twHw$HwJz]%^WIU*u:P $G?`vǑaG&|RǏ]t^b۷~`i.7/9x{k82”R` sah1fkxw< L'ҷCS([IܵON#ff1/zR^( cypH$Јt`Zv8*R2RaHVS!ap) 6fub@gK7q6u{iHUC?pְ,)!a5I:@%E s#qZ& #!W,b Nf@|APQ2Sב HitL*"b@ 3âҌ;.K`@aR+eI ?5m"HK)'*5:D"3؀6\i 3Q$GAz R5\ ."Շ=p?>SUu^tIރ57GCd/*0 I?eo>J02.sr2}Sy?#RVGE}bL&w89)V` $_:"PDN)(ԝ8\xJ4u^ gE-6`M@ZX.*ԓ3rLH]Te+/J~6;V^$+S(j`f_MXlryT3bTl x|]Nۿ74NF5a )2yNT3AϖT}So|xzzf`ψ4A>|,.8?_OS7`60xj,^S*^eS7Ydw > JU<=ٻMcpA68VM\Ht5Le$ú}:W>ٰ<~mT'=78E~}=?.ľ{3J^:9zi+JLvᡇ UC\&פ+j Rjsfȵ1OLL"ˌ4⸺M7]w{徐f! - -$̩Njo1$%ƫغ(Oy\y*EKsÉ9N@A$bZg| 5z6鰶Fӳ?j[;I,F mjSb#].k91VqsT["sl SQ+V`8͖gᣙy7|JP$]muW?P{\L>>bDDsc=0K8bJ!RDFg3\ca6i~/RK,"(l#fbj ߳yuy^Gb*MbQѬ>S|>-KjG-dx;vC;0("d+SڲL=@;xһg]j=Gh>{LljTe>Efg0vҦ˚ɞq#2x@z({u}^dW? 
-YC/s{c7bg>5ݬ/#'6s/42T}8-'o/9\md }R?_:lCJoC;9_mX6!~MtиlXf]KvG'U2us$'NFKѬUTyBMJW0gݑ`td~}$|%Zy_ʙ)^<5|H/=Q0g E++7O[Ф*o>Сpwd&#QHYu2LwZ ZFLP6MVHKD*ISsϺ cHjRa <* ?s~]`XZʩR ˽)7ncG;FwER3+!:kKNdnMK)'Tmc5f}r}k&Gk$^H1 `o).}Ԗx93Y{)C']J׌42欞-/'-N& (LxN0+D{-"#a:0- 9+a]IXBg `*lu4H&$!e F=rl4 bd$cZadPg($^2 YobA% ʥIj$fXڐe'iuKI;j ہP*ַOc62#&3R=k^d+\e0f6.ny÷B5T]^p3ŗF," tH# be}0?M-a$EV 1KS}0T}v5T:F5efgzg\NcFhm 4wFTaۨQ10NPBjQ_cѫ@IV` '߯NW'f:Tϰ棻ֻ'!_S(lE.C_a;j1-S ((e*b\"gfWOK |ÃvÃI -yI7»: >]6=-Vx$Ϧ%`+tC:~)۳ͯKyto鵋0 ƥ=7e^߮-1fz4#M/X0{WZ9V\٢v9&AX̵Q*g er+gIqwA4eH:?]̌Gƕ8~f܆0(vLOl8p{!1n ZĀ|.6Y-۸4힣]ESznޓվ?_.ck2HKZXљ2% W&E1)1GUG(I|h_Xȓ 9s˭#)XT/_Tu(!5A0ɵ r B($z2f4j|sWL#,8ZI}Gہ9Tlb{9&$;n˿O>_z\*]VӶ6;w ǹFs!.-Yfm OCsKE?}A|L RO{:4[3{;ܨAh%L hbe|ÑtV1c+iYXr{&CD{fhp8%b"XZZ;Ja8LPЉt,kGf߮@1CLkh7 ^k:7C4Sc>ܷjL>}K=$̀ b !_0A^ x5Š"*AVBo/AvW) d1!:e/_]rhξ9}#{]MsӇy {6$J_2HK> d8A1w\SBR>2y4)6IݤHzuW*wM0EN93XjptΪ9U<9]t;9G4JqTQܦ Ȣ`;X+\$JH!sD^EO;/nعl!b jܟ)b)GRQʠS5NA^J /|W uOu_U1V8[^w|En\DNW XEz5}]ﯭ8-٬{v34 갛[u{wf|(Վ7{9sh]p~K9|k+*.oFͥ;Ut`v`! ֲ;\wQJӪo~ \ ^@syR= 9 Ƙcdto₫cgmI `#kY"M᝝~J\ _h 4kY3btjv1Qv*qT*c^p!sm].X)ռ|DN}Wp؍zp-Ԋ7m;gz8\=٩yEܸRl0(E+*^ };hH%<~C9kG-VS4 fDK`^ 4 O܌5;0@-a.`ީlslQ0w,^om5Kbgv'X?d.pq'mj.y -S Ԗ3mm&Elg'z[JBG:ɀqcc.@.sSgvvoRgPlxց 6 ٠'^<w[)N'iBt5mڸN+mٗ[/q2M-k@,]f9"M:(hZ!q@T;-7j݇m /8k7¨LiDgVqv+Y-ŒE-.Kح z][|j8ӳ"<Oauiˢ$OEpEZr/)aOӶ`Zri[[3i#/3xhh;s5Z]qƸ$Bp@ݝ=b\oJZn w5 W5tfpfֲ3O8eFS`|Y=!͇Ƒq3uy΢`weA>(_/ЁA\SO M6yYtN:a4 Ŝ]#n&W BLY\'`^8i0, n8‡ة(:}E0/~O37@у~UxyJR[ ig@(\ߟfrjg"UVʍtQL +Kd68`qi9s UQz$Y]gA%QeUR@M@a~*?*Y9υ#r6fOO ߙ܏^럾J 4$3xwUEؘ}ޏK,i?_{WL7em:eZʖ]1+H@Ѹq Iq_%0eY1T9!SICWT?Eq0ע V$mB\CjşrwlV3VqQ/!Ab9.J!#!@#q-;gq#e0вgOQ0!S&p veZn8T֩t{kX?V7/hi`y&$_VE9N+DkHj\ǻu]?ņNVÛ˚cY܃v:&SRD08rerK1bL 8y"J T jYl ́JWp>]~ 63P^G*#R"%1dNF03,*͸ *"BD KBNu^b0 gE XwAi974]6ptqTPE&A&J9'R1naD.dlZ) jp;J!r($p-Hf@HPI#f Ap-|]k 7Ļ!ڹsbBqD8?8/]-D`ͥ_܄I6XpC $KqKwƅ^%8gCI@XG)[}6W  ̊N[lB\:$UX\rgH]u¢0c,Ρ$Ǣcp|>gw`$S(d_~Wpe"RI|vz `k]0NA +ށ)nTs|-ivVk|]Ulb@ܚOAnޕzۊPm~gs%Cnrڙ_?m4~j" DÂG$g/&6XmuɺYkX8e9߇#Y l_ ˴ 5ҷُ^ 7ǟ_oR{"" ^B/W2pt[FzMx]lV`o{qzwH7߾oo^]]5&^_¯\&E4f ѽ^ 
GLMMaj讧eF\3Wu`! t)u~oS\slc$\16HRh:[RaC&f8K )/7o=u]`/YP2P/l wW/FLk_ڌvSre~]Dž=ǯB][FЖkm ImVZ7i=ZN;:-+VM fͅnE|,JiʙWp;+^Op Y>ϊl|>O-QHL˫qlO!D!e!x0k5f,`k1hp!-MU&:磈5OƶO#$CRFk.(qCvY@iI(aaR"Z1v :s cT(O';)ħuCNm)+ ^׍њP2zEx#h+8>jK ^Ȍ ^`A ƜTpkGꚯ(&p~I6[AmE,@}4 j-$* %^Fґ0хTM=# \H☁ 3Pe  D96[XG4 iX:hhL+ : DH[XA`Ira#ֱ&hiQZGiIi'Ma-żJySF\ËNuQ"JV{7B!V?~k) |d+{|yqg #z8F$㑅H>0+CʘtcϠkH˻Qt[^:,؁ȳœ0/LX78۳V}^,j(8lZ?ZQύ7Fmlzt<٪VtFʁWJB%$ P&lPލ &}l&o?TBYQTb[I N,sb?)sb?t7 (HΪNC:y-b`k.PHa˜@1gz4‚SwJ3C;,zLZN;vEs0קYfz/=.a`;H8NE19iLL\O<-ڛ=i0%vOl2]!k??6` ֪CsKFDOAs =&'$>ݒ\smSsMbi5F j2>putT*V1cX'L<.SDcK)҂eRh6n{#-g{9_#.}Xo|vw(àw̙"rͭ/:3&en"^f181K1hb<+ÿX8x)bD;mx9.}(Wx<:fp]v#eH>Sb#ݵp^R8A9r^H Jb`ԇ@R6)vFU(vu&sNW\vɟn Zuϔi KW|"+ǽWX[  ?cݾxJP۩7 ?(~pӶ1po,ԉ_DqVٻ6$U 9m8x%$I8!S"i_<fH="3~TuuC/LF7_!]>Fp?>3 XHȆ^9YVg-SPY2g[!@#&z[$l0u\3҄HII6$A~"ú|NSLC9DŽRdޡ*0RƸ%bUց#"Rh ]<3yꀷA%vRкs_0}ݺ pW0'o>tά1.n`|XJ+d.ԗ~jW]ltJmO#U{6>nԬ͈ UBt}c9it蒱`hwӱՏrzÀm8 |?~_n9itn>/ }xȻWow_X5P,pf-k'KžYNt]wkoiu?3.WYCy3OeKɖϱa?0ɥD##vLf\>7z|0ax 5[͑_^fdl\h٘edl9Am/gf#m3'S~DGj҃z7!xYs%nZk<iw&Zax@Ӿs$4p1fY2uRNplI@f\ghawa_F|-˂mYDzRݕνW: [%H_R+iv.D4BlQ~tpGŭɏGK_|~8ʥ]~'?:T+w!X|Q';B8$e'qHy5k%Ɲ}]]%+R]>2 tG[ b"6(q9qD|~ t*ބ |"3"hrAH U_9e3txG2݌w5LL v|Y#i,uQ  \8ϲlMؽd|al}ٵ 3iUtNEsˆ?eTZPn8uBQFҀJfkI=IReHZ:l\>1xgѕ佹C.V.V`gRfi/䦓)7U={D Ⱥ\TZL?*6WQhߛzDehjury w'[$N0O'ku'ruqbBp u2WSŒ>T{աv*7GԺasd]GѶuqQ6Xr/h[sw;Ӓw\eUf:c|}OFK&oxOwmיefT(WDvXM2vpZ~LQ=Á.*-)8:I -Tحx:݉Cs[W Gۆ]^ž-͡fEWixdSt'OGW-|?Xg_}:H4$HS)#Ǜz­U?q| 6Ès.cvi$ e#[w+*AK"B/0V``J<h&,pAr&WZCg"GK(eǨ2RS WqBr9x8Fkpu?XMk}l]Ǒm5fTsׇŎK-+-[P&K." 
ardVxAAc)J%GNT)2xk +G@Kk$vp;ȵf}A:-U .9` e e^᜸?|ꉐʞbm=TuCa(ۣQEl애ݽr !_HhbcӷSؿB/z/9Ӈ3߷'ko o$ ']l~TEq]ϕ&7yWy]Q)c)?)폯AHO/߽w7.߼{|xoYVbgg ?]۟ѵkJv@פY?Z\?MaC^޸l !i\EvHJ"JF!Ph ˓=lUcf`QE;$%Bo+=i4F!|FzǬS{5$2&"iHJ;ޑr'L2Q%zH^@dѮ5O:K Sm\ LSKB`8yM3Nx"RZ`WICZ|+c#:-"^Q?է9hw#y{O*Tԣ" ̓SR!'W[M:[m]i$#`]]Yr#&PB)5 *Mv!a:`Lti;_[[yu5`{k^v]pyi0DزiGX#H˵dmBD/&X@`Dh M4'.8SJlPW^{)U L` Ѐz</rv/9_D= |_Gύ߹i|}NMap>H${gYmŒyL`wJL: Cr5~Dw-ɐP$hkP )75zf`I B9؝ ;#u\}y8)Kou&Yb,C*uIZj k@(!g[A 8^E6فN1xq p40x+f4nbwwU:M3)3^d+!zz1"1H!u̓ƿ-W|M=|Xbz~zyz Hr0OZ߀,n5je?ogE&zfz/fJlO%5XRGBiZ"< )V D_a Xo,HI,X*3 ;h5+$onDfLv:Հ~8V1Gjz[̀ưjzcV31TbV[M4mR[6Eí& fk~r"fӘS{V37S]*]/n5YSmuJZ ΚsZ䃼INd|ؑA\@9/FyJcqozc]x 5;y:3R30=Wfw6 ua{ OE)v]Һ{wBy(jjwv98\aɻFE{cN"M$ˌeF^*I'm2/, L1Fbasˈ`<0(y Pf0Q)ьx-ҡ"u@ R,G=+b2`z17M4)Rz:+XuMyq,ӗ*̃F+S4֞2-VKi ^c!6 XTK*{Ld4®w]g~)l?y} T`_7nzFk f4\m׉,W˸JsbcE*1ܨ$S#zjan.L ֳ)\Io)Ns`)3(ǜsL!r4zz\q3K;&_XV~C7!upF=6 0]oGWQ^'Y ;XC3E*ʖġHĖuTU3[HJEx:xN\/|94[36;[ PcҤ w`0$Y%TQ #hѣN2X(X꣼#Ȣ !1438&rb% zV ͑U9Qs;7R]2G6`sc$,&nϔcHyOnymo{>!\'`Qs ydLP A8a N 8dT Vx z{ 4Q| &D ^m.}sFQs\M%s!h~ԩ?ʹګs&%s"zw n;? 
saB<6Ay@]}2҄ň)+$Z@^"%HFqK:yr~^Y^˝a*s6C-#7T¥E(\ "v7Np…ॐ^1G} OA(f0 L)8j.a^[`t; ƍ'8D4=VBz Qp`f+n$0gSDYT: |=ߎul3l;m *oAk[Ƒx9D"t6*ѝ&)u^eO BLe/`4%JdRh"JG֣*N Q&%@ƣkO3TJh3ݑ-sJ[ã@iS.蔕; 4 R 458.jj ;FiDݦ6D58*x5QKǨLd7b&ELF6%"FTE[$tD2iBk%eI$ƬTv44C[$jRK8^30^)p- 7*đQS0q'-QחOD#B <m/޹",P밸a!E{6bƼvӍpkAڬ6mc\hOQ—Em G-k@.!UOnWL{*G_uW j-Gzh`/scc IpM*$ܢgh:j+61EWil9sPC Bt&&,k2ܟP2+sh uQПΧࣅFQ%'U|Dm?ۗ п1*N?~ q~[\?nRf6Υ&?vpMjdT*}/*ُCeIPb\e.uq] [G6IFrAIJhd1S.pK_4*EVI~quIt$:TdoF՜qA ;ObVTf~n zn)ڦԑTF;}?.GqfW 䒻z?l6] =sbĪFm^!"Pp|5)gq ݓp0жDL RM`X_\ST_=6QA|Ove#[5/Nʲ3@JQ"Gh I ,: 4=L\hD)c$pPH8Nj"0 7y)ƖeϏ&ux_wl}W9ݮJ"͹o5CszWjϗ6=5Yk^b;;|gCnJC+P¼aQ9]8NQcID*4$J3  sIp9^ ߻+r-Yݻ)܂gJFɃLU5 I" %"Z(bqϕNJqb?e?;9n aQi\L" I3͕'O:PY[?}rkmMuQM.Uqkb>%w 8=+U 5+Hu_.4dBir%z3p3c%}L MdUЎ\veUu_f1aDM6M|`O, );SJ{*[Tۓq |,UNpF59En8[rv$3[*3$Z@J8j=!HtByt\\@g HE?8bi}[[) OWiG>.Մ)MrJי.r:G գ wW\Pj\Y},x?V,-w׽ӖG9܍rmF^XL VtS7 H41Q`X{SBsvouP;+#g>dݬj|2.ĩ_C+QN3OV XijX?v~?tKO4dxv?y+L0h0閝i'u*k5^|.|O??ot~w~e}M[NKN }36MMM5o3o\3_|߄Jh/u~RM_kmCV^lN:@pz8߯)/ƾM[PmEx.BIȭPW}W:n3S:ʼn|0ѱ6F* a,7#R)!ְsd25^;2ՄE%"<*ļ! Ι.3gN2\P Lk ͫEC`o\Hc<![E?]=./=:곳xV+0`oux+$'_,6f;Ҹ\ TKJuqZW_Z!.*`XI|D ι+YRmB@UF4mxsw߭gRvX;Ͷ5 dJIyHYwEY4woQ֞z9c@vs \pEdCv}/}ijO[5h8LapQ4NScp@ :g(@ U0*QY-p_KXo~o,,ؒ@a9Jd|Is^]y#8diŸ@t"7ǵi\菆U LMDx͎*7!7FAS)UhTΛ+r$J!$%^)Cي^{-THƱ`c)U1H*TM([ZzF]YE]Ԁ_jmb+J?P^o-Gw<0G1β^Ry NBP-mc"^&tAv}CT;-( L͔BYϣBa x"^^s.$GdMHK+c!3" @85I*ʃ ༆(Gs*ŭЄ8-[Ik%}JIKbCqJEmU/]IGzՁN67ou?->K:_ F2CD[ Y=$oeF(5A!k_q5JM$)+"@*պ pQIETaM%^ڔeN?cm'issfh}د;v4 >n|DG8/4> P XlMS2+']Sk 8v}4l=×zޘ·(7ŠWWjԺJ WfE1)0&g=aI^5/ExHd'p™"1O]SQMOxX'ByS6JP %&'54rg2x`Oѧ%C=iw3\ϲfzpX4L6[w ǩcmr]/& 3[HJEx >d`-NYWְs&M#!*P QG L\F:hc7Zxo gcj9ci\LSZt,M{ϔcP'7߼dF^( )fp4hWjS!NU#Fwނ"Yyg/|uҋ:;6z'3fVMT 4ܤC0 ` &BB8B0 .㼈Yl0#|䱯GQbV£d=i=JυՈ 56zi#tz䱵4tqGC+K"~{uet7>##]m`HJz(W[$j5-JT^m{xe8wd8W[\E#dpѐE . 
n/w7~vxO.GC7L1໗+xZ\]_ڏ\gs}d95P:G`tz6RָM)_k"O|ohMv0/ȋY _0#BXYL^jʈhA #(H8Hwvi!vg9 v9*4'];nJ]6= 2t\ ̜.rf/fkeI!7i( ;+HZ&т)JS$seig|@2`fܝ`.xYB[xǂ♅60e/ݘZXzݫm'mzԖڵܷ lnۅK2]7QY퓖[gM4b_+\^d4A=z?__ۺcg0̺I eBQb ocJTlR>HܬY)Ѽӭ4Y'"NTkRrt|-(/Ɗ oGb@.-[Nwp'[Csn-x9eLH#Tv@I>E}Ua)\ b= G$go/Zz1Oo@ vVx=[XV Ip( c9LRH$P0ͤ[v8* [9 :u+%6!`BL1jveZ&o8z<9kf|o3$u.(jдsU|x%)ӫ]7.kg{l.׹NIAœ!n #!W,b NfV$0à &r܄а;"E RrHd 3âҌ;P  B' L.Ȱ=VK= P$JXڵrn@imDh4L L>?)q$FXe On.ssmMAj4%AF]d 9+ma`&j,pT =΂ԭ Տ|oxH%""E Fvf{YEcHqDY={*B$K0퍾UB Mc`ָ( Ҹ]~ȥySƞ"ȟ/yȣ[p&SkJ #H)D`á]S)($\d J( tQkVÀP& -,I&W3gX;)ӓEa8Mc,~Ώ#0|-#uwiJER槺R{w<<TtBD* ]GnV0dqt8Uͦ6\^q[HܚR%0{Μֽy 3eͧwWӳyb3b.uӳjm]۹@iYÙwBm#:t6 ia3Q`*|44 =џ.c|fmUq+$tIs`.u}5{)"~poT><ة]ﻙ)OW ĥƛMO[]B?mE_Uy]k4[2Ujag $'NoN &_?|2 .n!"1 nDd ?=bh0^34UlI|q;r˸[WB Q`C<h n|ycbً6[2Q=*;Mʰ9yl'P_𓶃o"``exPTUiRKMm{L~. w۩:X  "$r VJ#Qj<"Ev;:h¢?1BExrL$N#&,Ӕ GF8C^"J9d^yP-CNڒþc#ĝZ3,"Q?˕׷$f=ڊEy9 xk¤S6`ηM:00-&]$_Q+`R[0EEJ[5l d.2{fuWu?|CX!l a._. vaf-.--EBvni7W_fPmOƺ+˳!U4,ZZN|q  J\V8)iţ_EX>< f#V['/)j6#wsͨj`m&Aw3*-N䢤L Qz68J9+7;\)7*گe0CRWU^׃U\f!pcft5 v~9W[1RW`F]TՏJ*QIUVWP]jTSޛ0a lGPYQW æ֟&C4M8IcԼ)06\/UNm&K>:hֹ6kxiYZ5l¦Xi]Bsc̱b:č8 c±mmƼKE$"_hY"M1/♘|2f>(/EǸ.N8݈{@{'|XX0tad٣y >C;uj9*RW@0#`U"CQWZ]]%*+. 
+>uj]]q1Y]Bu%$Ϻop\u(*QLT<:Օ`s&{/?~oH"@y&@EC  .,J[dWҤcX0PK$'k`& uB v:k( &hl="U룿G,Ս<盡K6˰cy]➏ w7Ub5.}ڿ}L #傄~%j3*ӍةwYO@cAALb+?[ܱlFF2e@ fhVk|K1CgoP f$wx-SK:Ox?=Sx{JҐXs\x `burgq#Qj" )%#ap))qZFeZj^jFeF٦e`ЌΏ" 4+A`)J5 X=ĚjҸyDzУ;rbJG e1x&P b,W-S18YdA <uxPmb4@|V: DJNcR 3â`jkvQLXn}KOƣaTСusBoC*m@"F`A&J砟JYnVK dwRiF26HN&jtEf6\i ڃTXqHyXE>Å{>bhg߃<.-g+$;LW 7;1yD.¸~F$IqWQbƸSe=hzUӣV,UT; +0s P8tv!3SUUcg(85iatHz1< sj?S03`,~߆ߔ_H2*|-nL-H]}7鯖Doߞo\ rgnV0GKq&Y륍[B䞐 kjZO|_|;lg+&`&sm{]\Nꇽs ~a/`0xj֑_=u0}Z'> L4p:auonqJQlu ͬGaFHu8+zV b< iN6`fߊoz7)~'W,[TQ"/O أzӫWtmfٚ ,7.\O?ÏO?~D}z>~tsbZTi ix=x4 C\Cxӡb[ Mmͺq4^wkC<glec$͒v6~ %ؠs^RbQcEf]Lz7WNbwuu{y'm"صye2 59;Dآ&EߛOo0=@?bT F29J.,iuD0^$5 Bjevpj{zh8[U''{K=$-4$!Q3N>zs}>[Z*Ih`0$IQXSD< HB9ض jH\l̢zW|i~z*Uɦh-R܀LS>RLSNq飶ě #ADaA Ɯr aHu; Yw݅.&tA!-`eoekKϛfww 1 J"b%dģR{ɔB$hHWaRk {5r%#57(a 1."1(nX0 .jAF`ÊbK>:ͦz3ށ\ wGOV{Ltʧg 3-֪CKK)FDOAs:xJ2%t}΀Jbi5F (VY.0Bi3満,+ G)>R96D bЦ39u2ƁY0A^ x5Š"*Ac,w5!-1ZM^. w- 6-؅sCY{b[s̺ ,6J_0mn9@r-# dRKJ᠌᝝:&Vq[t1_!C%:{Gk1?#95֪RRi$12Pa]DHGR q<ץr1:ђYWH(SPXQQBOZyJυՈ}1(F9DŽRn&x9)VGEdTw9lVJHEnlv ].HWR,Ų stJ,u_ CࣃXs)jkrQI)%3ZX#yEB_̷Α9򶾼:\tщm(ÄdGZ׷}x}>V?_gMV7Fٻ6dUg#}766 ,X!)"-/w9("iKLWU-,"Dr- '>#%Nt.:J*-̾fqPԤi1kn׷MsL!tڮ=PKRp-yNve&#QHYu2LwZ ~ZFL&Z-%zKgSy6?}v~ &% ^J Ŗ%I#X  CeJKy0C`0$AZX,@(vGE3铇AvYI( aaވDʵ9E4\*iuՌR<$lO=`2-o ;u''y84v2|Ss\vCRU^:nd>ZR )Q[M H#,3A6Voe)k $l8[Q,6*ߎ]Tl r?{I(E# 7 j-$* %Ѱ*GJGt`ZD"rRNt7$a iU3ذb ؓvZB`# {&E-׈Vw6 #z!U@d<)Viʈ` Xy$J5ܐnyg5r >Y7r VŨ]]΋hp>77˲ZEuۇ۵IX%u+(>XZ&GBeKQaʙ" ͢U vh5Z`(3*h2:ڀmT hpiY[#g/Y߀Il뺻w\fP.)J ٮ,Ylpb' QG k%,[.p3uvHv$g;r($Nh HOU/Bù=T3\l`G-wf(}7{5qz+f~.Ϥ;3zR[n"_&w]Sbb^P-[]M'7{)7[Keܤy4M-s9Xꮡ^-h*+}Ep7x\>\ f'9˺:ٱ`uBBqd/_Xzիe'-hIm6ja limC%=n&'-74"=7[lǑTR˂}~3:2MLr$Tdޡ`m5, KU LRu>0AU8ʴ3hު2.DG7Kjy]4X'"@'z,YFQ,_x g(ݎ%I @.;a(B-J@XswC1Zg %Xjw+U|\Kھx ^W6/R.~Gs~|?s BL9 "pzMbs)Hɧ,ν![Md0ʣ xSqRH(ETJ9\pT)ƵwDD(ʐBBp$\؄ A2 ǨU1kQRk R5e9kO|oN`>Qs9,E!T6ju{ur!^iJ;^'Qr04{=X|]s:\IAœ!n, &P bcq*g5,hB3r׿s >>:M6wD@4:&(fEwZ$" (O XmqcPa(ž|uWЮusJo'*G"FIDQ)qX)7LbU;au\K:dlZ)8|yJ H^`Õz0?YpT =]AzwxI8_ѩNX 
{ϩE2}BBd?ٛ^ߗ"{/yD.$~NJG!QM+5cja|m\4`ٿG].A|A޻QG5=cX1SC9FRP;6qq뼺WVÀP&z<\:$UXLf O0!=w^e牋`X|CL6 Op]BT3{VUWhӓ+DBޱ[ Y8:f*6 ZqҝT9L~=S:+'7#Y~/-n\V^͂Sb̗ {R}73"`/7K3Nn? BM#5<⑮!08,Aa0,t`ŴzwczeX-~ȦQUq'$㴹jIs` G.u}3?8 ;wXj[J?d|y9{cvo}'\wl^Z|d0GUaC$?x]vlɣ 3׻!??7|oޟc_/a- Ipt 'OC m CSՄ3n0]B(M~7i7n]۴,KmhcaCI&eX<`l`'ic, *0wHTL}ɏ_<^(oNRg`3DH4̰6F|i<"Eg=wքy!_Cq``"q1aRb* 1zg(yA(mB W_ģ)UmM{TnEyxcCxM[Ud K>dG7§DlZpT<;*éR_[3~a*zpp5>`N׽RYl@O$W?ӨU[:iTj|#& eQ,@rV,ș۝9:-^T1w$7`k ,x;ĬW逰 OjRz\TknXe7>dBݳn0ip2M YЧ>_A6v Cago$ ;ob4cq0}vAܤ&R-~̀ۮMbi5F (VY.0@'i3A$8ӹ)nj1We6g\k|1˄D c 1hb<+ o wbf1`T"zd4ѐk%j_={ϡv2nϾ!T!ֺ}suӇ'˗V`8(691r4)Ĝ`Tyyj婟d튖SMxmScK2cNH@8c=$f,$Dʷe&#QHY!x0k5@j^(chne|9kf)uaGKLF Xe3Jvu%2~k x5:\Q,PT)Xy,3aj]u zRi{)]YbȥEuI@hu>j| d[^ף.ZeWq6X9.=5n`NL0W]]\5x,y%_]ѱ7 6kmok橗=fuu6^ZJf/{}8#7ZvH;ͩ@mBzDd>K?-^rÜ9Q [cI"!:{ v1ylz*)1&+d5қTzJ+c2¸DK#FFĤsL(QmGs"~udXDSXGk"D hl{Ӣ5rVZh,S8˴>xUi+6n!}vG(AM4ߌ\w{|3j5A󍨤AsAs4J+H]%>ujt0*QD%RRW@"`U"CQWZ]]%*wfRW`F] CQWZ*QITCuŴP萌D0#`UV*Q;t=+ `՞۶w;TWNKUᦢMLSy!ᷟL~:'MّgyT. g9sJJ GLqH(SCܒ`,PCKgF-9liGA. =i f.+o.UWNB)2 /"K_-Zs:t |qG)_ȝ׽wmFqxiv-^t2ȒWx;<:eYG-ʒ pr8//M]֎}Y,Y()p)Nvq?4[d{-|c?F+/8a >̴聅>9)]BHC>j8[SVsx,(%Nkd2B\iQH/#rFwN&Aq$jRYuR^B+ϟVi10TdZELFU $X.~,+`(R$6܁pVh@$BW*1SQBNϟY9 $Fh)9}epb)#q)KPg!*g ? (D1(MS"$^pM1BM⌜Lg"|N&3L3 :r8La 8ى_´- %I6s+D.*(rD2!twQJXuuHͫJҥ޹*z rA~ 'yi}b]:΅ -!yʶl}}zeu˶A04ЏG- mo,󃼀dd&CZ|7"(τKMFKZaҿ3O!eZys9v>Ca !* h q9MTsj)G((, 3Qd,\c +ɈU*Tdd8,i)⠁F3.kTyĄ™+]Er\g/Kx5),Zs*r'\ 1C2>L2%>?Y(j"7eTF$XT)C#uLֳ DQWR)\B#U2#ge܌Rb IKm!/l Yg j % *;OBSx=g *7>W`$H3< 0f.2}v+JD,bĈR RD6%|͖v:;5a.x#@wa2)`9M$ 1HzP+aO/;t+1k(!J I! ZG }t0! gBsr%t@ӴN:M;@) VF93)* 9R8"W@øpBSBM/wq1FR:,R@^ ck tq1]YS-g{iT AU΢QQ눐*H'$ BEK$=may 5Z$ qj|Y.JN0p5@s§-1Td]nƣ/T@3onӮ'[5{ ] _!Vvz5kT5ar띗*OmHI!sSKPQQdb)mqYD RڬjpT&o ,ZGp Xq6y>LGF* kp1rkZqZκ~uE69}Wtw~zC\,WK5Ֆ77U8]jӆaV8'SBHn-W:%G_{Դ͚Sޮɏwӫl 16䂘k?LՌ>viEmU ;AHH$ۆa:̲|@E06 9Wpp?gauirBQlu%PHXm4 گxofk; a~?,,N-~nOBf[GxKQaGlǼ? %? 
UVlBIlq/oPIO>_?w?|H?~2E I^}xd迿CRCx; ͊κa.r˸[WA~o 1!^Wey#=Fϋv{D;b QJ=A;p(1m@} G8@`@Hb VFb ڲ?n֏H?S:I(BYD  a,9-ڧL n]\%Ꜻ[ªVOqV 圹RSa|e^E4-uPEN*\L(\Udꢿg/lExn xkxʤsx ؅'<B[_ʢ4^mX\#l"US [*Fuׄ^NJ=;lhUd^(56%S?L!/P7@ :-Qy "$j<\\ h<8 Co0<>_7vy=B%.zLgOa 8ɴXm`Ȃͽ .}wX<ުi/Ӵ7 .7A=!ho7ҳ:y1ÆfQnʙTOmZ 5i^V(*i?_bםZa@r+ppZl(FZ7f5C+#=B .'uYGV@d@l奷Ub2e^e!|;9$|*XgBL%!:4HSg)po:"h%TIn<O%ڦUȗ&I?#!e"_k!3?=/|U}!K;3Q-#V )gz^ʛVw$)8P_:a*W3W)JƟ!c >&1 hdCWf=&CVIKN @$,H M:)"t $B8E>`ίyꗅq~H؞qko_Mا gۤ¡"Rc _(O`ӈi$j SbT%Q} M`Znz ic}/}v͑}|U&ӡR! R`eŹ7sSS.Ov,Z;w Tr˼KuF8Id M(IZB@D%nbQ|bEg(CNb#u;˝^ b>'_.u8WBZ.8qh7+aNo=zC[" =U ذYftݬ»w=}[ AYQL]u2ڞO\ ?6ȭYBv_'e*vM}Cs=/,hCۋ4ṛG۽$?qtp=Kkir-Ds>y]޵ZJy/Ss>O2#zuȮ-!M^_LL*7@*G-VU^ G|gIի@8qO/9BgY{I+%~ SowtfF3Zn[(ش0YU6\`2tf#ޑ;T#Hj+=ZԶHPbx=4ijgNzƈVɐ@ s,.m7נagmI 9Zr;Afx}Z(Z6^}*l,ni (+4 iDV6$eULɓV*q$`s+y(-ui%v j5+*RXD sojDSAjlmGBw(6v"OV_w/[Ǻ: ٦1!ܸy+a'ómz ؿ^7ʺ]^͹Vv < jpnXz% ^JZG,.cd3lABeJKy0׮ I @]q9Nyar(6i)ײ8Oh85/pۈ/rN<QI.۫J;UPUO:J)1@xnI (!F8ZRG1s1WRFd>Z]w =)Q[M 0HU,3HM[iZXҖi!hZi4eZz`p~ѼUo^1)zp !D%$ItY$%Pґ`ZD"rRi )r-Tӱ-TvT` U iU3P^Ah \Gu*] ؁L#ȁ1mبw$XcwqL+ : Pd)QS, A9ja Cc5o:L0ôubNcbEWWDDsP$hΈGMj Nl^{>Eյy+tsss9>۬H&z;n^Sy+M)_^=.k]1obݠ5bO\Gvmb[R꘨]b}s@v\XvӮ4ìmH'U܄^S*:w%Ea}ے+NiX0PK$'k`&ܥn(&Q묡$ jφY,0Eb[ v6v=%fbӪǃi )aujP{dC'g>GWeV/oa) b=!SIl__ V $"p("hD J1u(¹c ('+:mqkBuťCRj\ sǥ]vprNQ/yh|[i;8 7م) IJGOu%%:<<_>aX!"JB`I>T4(|^qҝoE0%3uV5OfFX~|,_ gsaO97׽ANNvmg Ep\&B>8ywjCi$BGb|HMÐajf|Da0,i`Zɇd1{ccqTu2ɦQ窉E"9G?#,~,Y>ΆanvK1//o}p;|36d%++Jx3?Jk4:DO>5T=2U'jA:sS@~~{O|<~1&x`_o"2^e  ӻ_1nkho24UlI|q[rø7|;_wK@XNh H޸Ž-Hw‡lIʇ1f %:Tv8F@{M" `exPQCBKM~l4֏Lqpxwb|JϜlVH!HcV)c(R֑ %KTk-$[F &iJq \JL#V#P![/ lWH"'Vv&T5o˺<ouo)j [m&ފ2 7xqZ*!I/ G>@JBk {**59|4KуGf%kZuWJ΃TXjR^h'~1&sa Sтҳ鷖 |2sUڂyBހ1T)b+t@HmjRz\TknXe}Iﺣa*)Eӈi5"}_0>v%yuJ}i?~zIW{l06hPL36/{=k?Z-}k0R!lfb!ŐKI$-@b) I4s-c1&N(S<`s-:ʠ:뻒ڙ.Z&ĵ r B($@b1@1gz4‚SwV;Yy dOu]O0me𝨷!QO"D;r0i~5mA 3 *z|B,[JL.0" NkI[Uj264 qTZ()X8gpd:p*6Bb̘R洒2hhM7us-f$Cx9nt][oG+_62Rwu@N6 ؗ5T\K"RO"J)JI5lYh믺q MƋ QJ?J_*!devfWA'$Ȋjꍷ㝳vϟu0TZJ=k0xss^ɾ7,3S6WՏ9Tbl4Ry1ipB$ V?JYM2!Dme9&DL0qQy 
ZP"PJI0Hېum2ZDI)tFR|$KX 2)tޕ87Sp!;)I˰sbkqyt4REV Vj,EPkǖ\t \IxԆ)#e Qۙnhd ϶YvI'ir%;@_;紉J)+!3.v/p9(绵 JP/$+2I6V^Z/ C#^a3qLeCmPڗlGryC%=#J<+CRdppWѽ7ǯE@N?GMI(L 6H((Bc)|蝤V~R']I Ղ+PJ ]=ޚqÅ$lf B9;T pTAt+)2mlDsXXQ:9$IHA*zlǭ'b|]wl|^[(JGMRh! 8(*[)" MT,(Q ™<:4%k|g4ց[+k٤.)ᚾ|I^l}iD?Y$&Ol@Lwxv-SiJ K+4X 4 SMG?U4 6,TfKOIb}&vGaEy0섗\"Ѩ5Aէ`͟0/u{5|y-yJ@oɴϤ GPEX`'$ιBF+T:3qn'B;%6>x3a]< 7]^ssloW6|e3oiz7|ޡb;\YN [|9♷l3qfa78:Vx+O!c_\mȚ&󃄋kY4t[!úw6eɑst^Ok l@M*}UjGU)G}5^Z>=\J`'V⮞R? \FZJJ@[poWzF\IR+, \kእԈ=\=Ct\,슥=\=CRYe̯X\~_R:\)#0=\=CB'vW7pg:\R{Wqr**W,-pU|pe(j/@?'U\؛*]*s++rxnzR<}RAcQ͇_d,"{46XjhsFA2b2R6I'0\AG F9:/E&l"ߋL24N|~O0BF8jg1ixf/c^n7߽cf_ˌ{czN=t @D?VBq-WOblR ;EPل3+BS}mFfI%J8r^ToK8p~}Zt.&dOJ!]Ov@{miC w3-1@OEA ^9Aϫ )l!ukHwx;T޲{z$?{^?pډYx6<ʳQ]wDt0nSSb$|S1j ۗuf2ϋ/\;υ/xj9yU<%3YӀ'u_cg G 88QJt&:G)f3f I7Q$r)n> 2oѨҼYo)`Dڄt>8 -ggwew_<\J':q8=!Ѵ8@壝ҟgV%N^Pv}3Lk5ns/?Mj]# 8|*1+Ej*޽zF=(ĺ֎ AvUey͆i4n1gb]ǣDO.q 6xu}V?jGz:f4[Mʠ8r0|b9Kioԛg4/Z´jE ,a_aXģ<ȋ_OgX oΥQ kf8>{6?꿯x_?~+}Wo_2TA uoy/}~-] -V:}o3nf\0 _4 -ŌKؽ -CNvKqg:t)QwTе5/;윾eٹ[[+e݈x&F3z졋+9|R*6i|ȅPS ھZ|GMqyi[ߌ[3nx=Ⱥgk;*ίBq m_+dmn躳^j?b3'bZVIܧU`?yqU\7yq,-~M*>/9ն{ݐvN[NaB&Om8Z_8k#ɢKm84+jpKafl=̆9y:Ozݭښٿ+sx;OisL(M^d JF%ePim)CT&B3 m* Etqe_ALƂ9&.u|ٙ87(;mJW©fgzrSԋ^W=;E V*m6H =ڟcV%%1Z%ruْsQ8xp( =\r#@ٱ%˶* Q( R]#cg܌Yʓ`agqW,Tc!XXxEn^/-zN'iz4Y? pt4t4Lrv^{'20@e+dXfqӞL0;1KjdeSVNi_ T¦26D=|F$*"Y1bw&͈ǣ\ jwڪGnV66Dn_=2q)BEgة&Bxؙ87a$a ]Qw#b]wRcVҨ/lJcrH\5 = ص_sض[F ^ɔ[ڪ<=ab}m w"{}W'0@3ofasL+?ρǣ"ʜJ @AȢx.9GeP3P1SwnJ|/} 1jgg%g"HjS0bI ZHT?uQeS4uA*J)&~^!""0Iu&ΖaX1tݕ[G0C^ONB|6s\7CwrΦg׼Ǧ'kRNR^(%+Ptz^c-XTkK2a:. 
&ٻ6$W=.;Cݍ6;3/ 4yDʄeRKR1>"EQ*"bQUɬ8 guqO&0QCJ[$zr<g#0;[bibagy"ϒsJ{t]:ԏܥSSGɏd#Ygdqր{zZpi0Sw^jt>j9;rY^2^0{<`yֽՇ V//,{ayaja]W M6eRKN  " Y:.t3xgZKZnx鞒ۦӓ昂j}q>$e4ȏ7WtzdoX)r#И# 5deqhe·c  j&kҲ@ͨ%e9 o3cJSyv7}w2q,GSeM񜘜BBoKmƹ0- :N|O7߯y 9qujP{-j"j޻аz Q+B2"Vs}muNK'-xJS- {L圗"1(*" '>{nArR#TRx(IArR P`=4cq ʍO(YjRwi/Ä|L΄C6e{W~K3hlnlJd[QOOJy>s!Mx, ,E&qQrJ,D!P H/{-&BO&rAΦ\XC+@|m jP8הE0x& @Ltزrq,[+i’`ji1oP[^`9+FΎrVՏW f5ǐ,TD] jLE&qK<aA 1SқAO@{:RQa* Hz' OvOYsR'0\1ޚ28$ҨDE0Nu8(N㉽%7%#z(bi;/d!d+ DOi@nJAIa(1 A 9.:Vv;*GFw]1Wla᳍TxQY HB3 UuHkC$ %1cj[CAJ o#4{& Sǥ(5a'>D&ZQK&My1s0ļpm*pQb_|Mvp)xx `G'rwtF+ o0jaQ0*LjrQSGSjZjKehwMp^ ᅌ  7e]z3ߡ^ _c6ddB% B( WT k'ATYtm*!+KDn1[R?eAYf[y9TRh GcBM܃ /xk"&DH6Y驎7)Zm~rZ\οX[6_?h0qE\+3;ѶhueT5F{^< Wm})";Xs0f(;!Fy .0nك`f?Flh ҄sZEe$]m(vq=K\I%hpvvŠ >lTh2+5,`e-KMHP0iY]{aP"[ymL`z|Y_w >3٣2tpԁZZJz ,mPSR$*ɭ52p1=r5cVQۤ(@m|zppWE欲5˶v=; ˽:{)ZM~/ I 턩8^y\(a=;4$"P>]V&o6Ӵqlpo^x{صOY0f8?uw)6fvԍ}ͬ5onmFF2uՍ@h{f?!eӢMP]Gl[ߢ|ԞhڮNiy|?D+#7d<0R Q-[>k-G_ƣNjYy K\b讙vUYmoΧ+JgS̗_oCW=7?o܌Cw5%2 j<\%DD5\HQK{!C7΅MiN 1j tUksݓ坯ݓkXC+.#F^% wPXV&HY<g5w'I`2HgM,E*Q*8j)j3+z_9-I}Ŵ&̷i0G~ >zGv}n=l=gL٩>\N?Z!O?J)gxĪzjL>uɥ\UL%׽zIjV+ng>QoK(6&]{\9Ĭ8$`<=:p+Uߢe␗ܼQ.^#| #?Q>/XLB֍8`yxsj>/Z1~~p/?^'Rq}%Uu1UK10!A;4w;]dO9.Rcš3h`Nx4\Nţ u&SHѼA[A4;#uSBF]er8ue*S)zuՕXyF Fr6*+침Lb5|&d\V9y_9t%BWFT$@'d/]I_rt%;/JҞΕ)OARSD"EEćs %ޥbƒ*AEAr9!z@ Iͱ$ ,# !W893LRطڗaL0ThSv+z2OsbN|iy!P.>Ham/(D4n iBC ,E&qQrJ,D!P H/{-&UK̊Hp/gDl0ʨW#(Uk"e<[ei &:nlY9s$9g Ԉ\NYܦ2Z5{ XO-HA<%2 K.PQ E`DX;IV]~R'-,}Xx6|)Q8sA"JT$,T@Z/ ϒ>q1R_nڲ-`7wގw1QKSDirojg2=)%Ā%p:K[/"Hܢ7z+" -,|CyAg1P+$P" U(T1<׉` %u'dOpĨ^< -Ug7GYgmRM>/}}^EVy@ElNyQcr Da6@gt}Z ܷ%@~G˸^^\X1EKyTÿٻ6r$U#@pff/LaXYe[-Yݒ,S3@XMEvUWd=*'ϝx쿞lIVřbINXQp~pBq3\Mҝ'M|}dGyt p$*8ʝ7\ ɘ8P=d:FY^1Tk%i恣9;(@ ź쮢ƺ2n3 i%)FHa9 #ӁS@(c\V_,{&CD5)RR%nbi2 Q .,AT?;b^=l$41 GW@{o|ܧ563 k%k 41[P`KŠ"*AomAviC:B9(๮ޖP8FQ_LԕqC~洨RbK*#C!Us V &l,cƫT!E;0m0^ #)70VZƌKL:bdDL^n9Z萪oo9j̋5X)0naϞٵ) T3Q^_"qєيр*4((sv!xrHV-Wl'%? Tkb ^NE*rmSc ǔiuЌt~D*o x$) S띱 IDk1hFs+R 09`߶FŨ~Rc<u5kFue-=ߠmmGzΛLkC >"l5]].&MpY)+IGKhb? 
S_T}d_кSjzuzM63\ͦChYA˪ͻ]zmeZhծzsބGøY3k:&ww砸0s֜MtHk6s߮'".*l#wvHVVl)vE'[h/40 67 (DJ(zYsϾܮ"o%[iT{T (&,$r<V⃯$K Xg,~UdU0y+8bt%5w冶 m)dDb({ZyOGjsaS56zi#,>z% =&W]UrA]\0<F>E'bl3K@nLIr$gT)AJ{!SI,唳ѷrʋł}@'zQk΃"!@#]l8Ÿp}}*(L[kBXG"\Jl&y$,ViǬQFYA@[/S<]k"91 떼ϯ;^rjK|WPG e1x&P b,W-S18YdA52_ur:Qxf3/)9IEdTWafXT,CP VA9  Xoqdžq?a&wx4&/- N6 N J#E9Occ0VniA26HX"LE!r$v/J[= 9B% "HyiJ^ᒫ|Z^ G Fv~5EaDqzeG-DO`ͥt"9Lzuha@30kܗ( C}a\˛tS5IsA|COȣ=DES1οtWNc N!E|RL݉pn7'I}6W  ΊN[lB\:$Ux\o`A-p( '|ŧ<2> pӻ0"iW Vro޼_,U`X!"z='nV0{SqOm 0>R 04uo<3¯q'.'gsbugӵvmՆ\N/gf8_AeuDHH? GLvpp3[vyjruJQgs[!eFH}4vWFێ; k[J?~~7?kRKJ6g(VO[QbWzF)_\=Rي N?>ɟ~x:}?O^&%(O&'Ͷ =!-;Nmkt7&4Y69 U4U,ZZN|q:"LAY/;rgwfGP@q3su?6=e~La RocQ>Jn.ɇ/?')G/x{DŽbKqĒb:x2F<B I Q< ̂HB9؞ %R.6r֋h:`B=:RKdGUj&V^@:J)1xnI (!F~gF-9HQ(" lG S\-&xh$*`̙rkl׌Q^ta6V̺]S]Tar;*NnldPw/AeA|[i$! ^" H$8:#ߘc7.$Y\2U42XdOP& fhJ`!#.Dʝq(95 qy(Z;w֦EkJjJ$ >p3Χ_k@np+@V#n&#˪#88f`!2 hGvc `=p )#ȁQ1a6rƨBOP4b6V#X4JVp34 K̪OGPA+'JjPra<FHHqb b%,etz ",iP.E$&`Mr;:g#gFs_:qɶzQd֋ŢZemC+\\NΦm\?'S?)X%u+(>XZ&GBerE)g*Y%ịUh4BkM̨3ʤF刊`)'>wFz ~g}rǮl*gG+_*abjavŲܓe W(e9V F"weEY+\4br+E@ؑ;#ɁۑGG'%qMK7ϟBs1aԸO|ohL@o9S3|{y_G}3cAv& j0.&vM/nITt~\[޵! ̂.+rS Kp3"ا\LDX,-, 0l!7i/rW@;$4 jal:[ý5-Vw"xZc>ëj<[<lw$f ˺Hh\޳`xfXY2DSKo{vVGmas߮'v:.t~GK-wHZxtm t[)ARRIS|iWiJ_05pَIh+BCa6V%چ;& Et,Pj={ GwmfR)iaf@Ѽ*-l։褠mWr=4,7*(ˍ7I zְ`HNHYOL tn(&Q묡$ M~ggl"XT}.o{ټu~j;Bz]\s ]aXiZөSg6z~n4 DlڼWϷR}_xa)/|4y_G h_~K/iO?7nF9qm1A#RW@0GhU"cQWZ&]]%*-+B"U"XQW\E]CWWJzJ1Ju.q%]w]GKh.qEnT.q;]"E0Ku.BQKu.qE˕w]Ku.q޵q$M7R?_x/;`kP"R~䯿)eH,szw߻QTﺿw߻]{u]Gw-RWO_|'H#&E9׹ɐ,z+sZse'h8a*Eh$JA9gr)F(0˪BpC5#r!X`h@"o R* r|HKPW#g%Mk~ D %ؐiN.#<?o}sWּ: b"G[ŴB:- `$Wx>hbJY]#3Ea4 cr%B IzR 07;{A}e(bcw֧V`O-۰ JyRS@%lΔdc a`aѴ}J}["(3n{~Ǟ?iF|@uKt4%'ìb=IYW@N&^ NaZ{x%)2i+?`?-#.9#,mąARjF!oJDgɇ,f)<(c~֗la0.5NZ) ~bNc98jc%KWSTxi+ZiWf8+nh9U&Np=;뷯^~ë/_~x\ط/^- #5$H&L?D_|]iC/6Vq~m\M r4*  *q۟c#]EKH j&9=Ծ(~wc v7nB K9']?w?M&g#7۩3EL !XD "{" r6 ©>INN玎,4aӊ!<0 ,c8). :9NO! 
6Җ{HXBU9ګjsJ|ucrF &Y# xSnJ­τ)L_\Ã!fUw>ѭ]6PJ{0C(_ d^҃ٮ?FJ_ Xm жkS!-8R8EK jsT=/ž:v޿4%?x]gd([gpݿWT ͚Y=?zl'I~YKgW?hVA`wmMbS)Su^1!K,9ԤlQbd2@5~Ӌ=>j|La:m;m p*dA4lQj)bwՇV:lpH |x$y?mb=׶mz_N>̾;&O4Q,,rG-!EiL2'sNz6Aj]?\4T?HvCc$& Ƀ3.g 6A*B\5%Gk"l%x)ѠG@rcIkR0X99EOJE3 ñ/ӾBTC3&MO\lդ\ݾ˜Oڸȼ孯X}S[g`5Un>?DU,Hsu.^$ (a dgd" E"elUͨ)UMYrAV]c$ސr-= |&%U[3V#gf<.BYY^>z9e!E9_SL{ؖtݜ >4tw_hh40Ng_ƶN9˒B!d2'ϑ!1YP{)$%cNV I0MM4$#%ruYs̢5v5r6k0r_v5-{#ػ8@ &aPETr B#4ӂek )N8Z` r /:J$Q$d7 #̇Ubj4Wև٬[~Q˹(}шcW*kD5bGrCybZZ`nJD w"T'nLU51m|cT a9虐B57m,X y"ڷY#^;/,µOj\^ԕbZe<bCWu*AO[}y;to4lmx F[f%pir.dܥ m߫ĥ%o/}Jcg-stjaН|x A'+ն;#q<2hƵjHw7I'sHzɭ& Rn$X|S+DN/PpFZC$pji#KʙgK#ЎqbdJJ'w',ԥ[lH[k_`9FΖrêx_zX p{$"7x }d"faVՋ/z =Fw2@`ܛ,@>,*)YTj٨z?Z}-,N]9e\j[VpX#D4WHDF2c)Y􊣊ثCx i0"୮jVO$ BVNK-KHܔV`9P}HZzMꥭ^Dh+pޛ*.XdWso'tl`9)u)υ M(*'QLϬţx,^{!;JұCUA/ŰF?ZҴ_iw̾}-('>䉒M1R9{@Nގ$0&!Q޷#$*};¯!@)1hnp5.yѼLcU 5rI]TpUF(Cd *& UJ=`ճ&c\gJ={`4a*&֖i(Kd:g d(9kdN\{q_@ztw-~Lc}T`2,r(ұtlL3|4weSoyì@" z)|9~BR>etw6@D<[&8`%>R2" I'96Ī,r1%0YI0{6$ / H~0Y'{6qkxHC|3Ç$IQMrƀe쮮wWW)RR%nbi Q Ƒ̹,Y2qP1{}HH)lFG7*=jCK V?|\|W`D cLcxnWBOšp)f{NB 1omwovS9-Wq!:aZGooKοٓ#9rMi`42x 2t9g8== ::,ޣOaUN-}'Rڐ &ʌ܎Y ^!5IUYG 6 A?/aN(E f-@~Ze]7kz^oq KRa D{ł@FґXie3.3ꈑ1V⡜cB)7j<۔##"( -J)@Z ј;n qR*oYuK+7^@W=V$7שnDJ '2g0;gusKpAUߺ}ݭLM - NR8Á̮#P@;4ŜjbM kéPcK2cNH@8Lf,=$"Uy+HLGj)X,x@>{glX$"﵌FMFS@_f<8vj~݀7~~m[͙ Fͮ›Oȕ#wŕIvӬ<0ோn`tE=zaf٘VbF0r~iW'o.Dm07}:3wMh~4Rꩿf'ͣ#/zE}iN?i|k MUדi/+ϭ )4BjgBoΆ+g)TAS 3Ƙcd|5Q G{ɱ$/pJÃ&΃:մ˷{/FB9UWWp/wbE5 t*L=TT"EP f*$px+`Z١WIPҮ gRv8L/"g7;BB8B0 .㼈YοNhC=; H%gO+I(P0Vz.ljKfF/qD.E2K$H⁄iAH*]W8`}G#HP@k S//u',"%pVhia8)%(qYk#k̃4Pely[$d!qw{8+-dF;K.;FAQ@8pQnno}ITIjK ^' 5sf"ܒ18%j+e/0el+ ifYH:YSY~:"[7L/K״Zz@a_Nr4 y$G `1AƼSļq`uޒTk'ZK(Ib yI4J!#.D}. 
ێ~<:EjNjwR>j+m(w8 /`Y­$ RZ[ah|P,<\H☁ !fE;y$>  GQ>r8Aܽ賁^VNm"ǶgD\E# m0;GU $q&@ D\yk+ho8Xòr:łKsI&M>p%b6p6(01!:QɶrQd\ ,n FрG!eLD: ָjo nlќ* CK9_y=fk׋2 QRgBRSQ`y$A-[ SnY9ịUh4BkM̨3hQ9b0XJ jCYKS r?fʡ\Sf_y,M0.Ԛ5BY+-\4`r E0]Gȏ$GC#};)S4  ү , GvnfO>L_]>ɹ 22|yCA)ۙS{\7}3%A߮]["_]+!YS쥳{зrln~xǖ0 㛎ood =!"=AZ3E4>5Jnpb΂wDðq$$1c!Ho;̝obt[>[Z0b}3vRf޴gxSLgˎ//+%1fǘcGuǡ-|ug╙bx -?TL-$ʡv*]ԳTjno8m9mxt xt$!IG!Nm׵5-}瘝}_\FÔ1%PAhVGg\N `eZ^cK]!-ڨQf63&omRsaϦ҂N@N:k[T~[WeE݉7S$Cl]zְ`HNHYOL KMS黡`G`h,t4 =Yb͇ա`Fol^e~GBʹzݥ)՞H_|0>~kQ7|yCSH `8ZL&=~wCND&Dm25{$g v0j !Ab8؀RHSQ)rQqשVDZ0ER( u+m%6!`BLJ;f2Rʃp-A[/5^#Z6p6 \Q׭օ} PUCjx}q%V;wIݣ`hy-f4נ|gSnTZGqL=11xpqHH+-S18YdA & ,@Aej,HN{")9IE$s2JaQiƝV2D8` (O XpNJp0MWY7nBrn@hmDS(E(*>A?)q$FXu/RM0AjtdQC"6A9+mA{0?Y0ਤAz1R^F*̙>s7 t FvfkYE0}BB?g_Z$K9 0$IϱY.Ҡ q/')SUy42}RŞ" xeQ>9)Uj#H)D@PDљbLYK;?*' /@ͳJ`PVtb քGK ՌwU\vxv( ' |v~|ᶊ_ޕI%o|$:99\^/~TvBD*.Fl#Ʃߧ[߯AWӹ)UQ{ԌS~uO>V7|X4Ζsi8蟝׸zۅ@$ȇgpAgDHJ/_mҾ C,'`DJF& SD5ɪ1ֺ*AWnmֽj"LRÅ0呴0h%ȯKc|{arﻙĩso~P\O(+=h|Y)>D6Pib(zaW$O}woNO)&>~ ?{AEg,s- Mm4ɺ᛬u-Ѻ|W"D !.m~j򢏱GzhMr?6J³e }7>VG5ghg3˱ SQՓ4O~M^ݼ3Z?}߫F]bcQG==xc3Pƣ~ 6 {W@Sjc%T.sf_aD]|Rl Fd0M`\oAPKGo+)}열6n-X}Yv5ٽ^ȶmod;onk[/DVa=9\ IwfiÖAZYUJBQLP28,G3Ha\l,~8 _գYr$3[)Kr6*>U]T=14֖:uE] u=7<[Z6$ PRj"Wq 2gu:%eͽmۄv{ İE~mK88۟6Ql؄Vd8>~ n)uӻ򶹾%+")?J'OO~N1 "8hYB_4"TpZ>}[D5Q4a) IVIh<29$n`2z4F+羲aA8RJ? 
גee~iWM3ۦܢ"D A8Zfr /p`JN [Dy;tiZߨjso;}{(^ |[]#f.lu^s2 .iXMJ+sP!/X]_Bn}~w&Yj2ij\L("8OJ#(AT9_Gc.l_,6.oܭfmkGm.n|{;ehQ1p,&DhѨb}CrU^O'Y+_ZHy .]jf#ԷկC5g轐K60c3aMMIlZ:%RSh慑V0 4J84yH4.m% m-Kot1XjNxۺto1oپ0^ߴ=7"}4![P+|r Uý7hB$nC>Y F+d7-ZVZ=N\y͈>cc&*4B\DwQ VNyh% -@d amY*Fn% W 8*K-g7m)յC?<_zʳX<]{T]<.K!$?.*[t)ȨQON!3$ 'Q4˟-ta=^x4p)Pp2"Z ~R00*4L)H5*#H A!)R As"x׊;]xc:HIwk5LxM(MK5E+[ UEDpm3=;+&"U>SX{t$GE"H%SE#b6q\BAqDrKQP=V`뻓9JI3N}3sQ ~o"ҴCn{2mDGZ&$T$+j+]?ڵ|7B\S 2@eE<5B9 q#D?Ǫ}6/Rd>KrU9}`.{Gm\v.qK;A`2񇜍dbD~>@g/zMJۊUCd[Zk%N>OdProdCf ̮qц$IO7LL#yP:I5AU9C_ZJxeZWl(czfl $ )6L+9DO@ed˟ جw<?H02eoxrF@n%)~l?~\.2ˉMLZ 3&IbbJ1, qD:G-E7 lz3ݐry+ݐj9nHyj7O7l|(\MyW6{!}^ŏƆTQOԑΚoORb?{Ç,kN<3?3xFjTup%=u,g{/-~ dV/NxYέn,x#Ɯp^W NGN Nk):\V,``h Ai$u &cf:%21]6Lhc"\f186=xrGx6F;k5\S8tTl9{O_S=Xbd5 vzJ'j1dzHG]c)⸛UvfpsPRw;zr=oËİ}uiL ֦56GPOggg Sptϣvg;&]]u7 qd#NZv>y=xc捑a:loyu{VytK\;~c`dpme6zKJ4oisQo_ j #;[ )(h%\9OG%eMPsMOBcךZ VT}tq\Rn*m5vg&95_FovQZ!Ց D/= EQ.&5m$ LMJޑhb*`<+S}xM,s*d =MDf4Te"0 KAY멶5[H,Z枅~/>澍n Κ;uֈ+DP-6 116Ԩ]5jWv5jWv5jWvRRi`Dը]ը]ը]ը]UYb檋AW#^S=&z;/?"Ԧu5\kB#ZeZ*TsBKI1ϬAōZ: .h<BRAFOʩcAJ4FqxL Ѧܩq*P4Z"@'DLaYl9{MtO<ϯn}ݬʑv~ɌZ٨yMt^ݧbނ&1SCPCwZ 4D( 2i`k\%7  4Ƃ[V[\4)ZP %Tɓ-g?ށRd`uWUi=S,ʶYɳD%-KK[%˫y*dC^2-zNaF 9H '䶬Nyt2_5e>/孚=0ڢS]pYk6yuxy}z\uynR|FCQ X/m]es/YɡoCNGl\*̆-zS$OL55`R T%Fār x d9XOKM>ૂ6_q`h?pC?Rfue’gjGd8&짼݌9WcE>Gn )[vB:/:۹}k卡Y}p8-8)3gYQ%hC LY>@Ϝwhp4HS`/1'<,O-3_@3S=&gJ2̟,^lF,L][ԒDQ7'$:N(OO/p;ܐeǗ.hB&5 oѣ rӋ&NMnGH~\CVrmpуf_p,W{?o柺~gO,X-~hxi̬v#P6y OvQ w;]݆߆l /!'̻Oxtë|q{WJu}w}VCZIfwemH02]6H\plގا NaYԐ(! 
I,ɬbUH|#Hy$hʣ/'_yPv_sf7>T/;ð@BsOI(_@xي_b ò%>8ʀh40KN3 Boе↦*̚8<];$_xǟ~~]~?o7??~?ѸLS(uֈ p>㯷xjzU[u.5*e{ÇB!X/~j[UctU9[S(CC Od`vucd2 E)2v~&:]@OOۨӂ1D<`SB ag1^%BUQл#s$lکwHLH\2T" [FSrEu7&Аy Z BW5Ky{U(s3/N[]kt񷺁VIIަi9Z^JxE ȬBUu17es8qߨHonuuźެv}k_ggݽL0LӬ~gtWlwgmt,ql+Aj$6h䱑Paӳ'XDQ]6yvW4r\JJR)E | { 8ac:\ GDUiU{3¿F)*,h x˴͙,+2h6D ) 镼8JImH2AK(EziӼ\HkL^MS4-14.uܸr,ϧ/Ybۜktr4)ȭ'⓹jISI_Zk  /=O^GIM9EAYAȨ1fNГ F꘭ѥhh&:Msp]&َYAf싅2 ="l̮142 ]a;vCFΆ#Glc5,̕\D :HDH48 HS0 FS*b #2R7E΄\B+["Jd1c3'@rHm*#v5q#XPv j{I㉔Y},$ B,!,17ϥ\1Y6Zothx.x#B&fHVtG#1q8ÜYFȨNڹxXMxǾ(+#"#FR` s)m08` EPD )o "EQĪh 04m$ Z ^<I%Z$JL0G-&kPո8ӻDC 'LǴYMKEUe=.nxdBbB9I Z04K0RJ! " Hw$}a5ue<=@ؖSֲ??l~_9d"zA׻C O ʋ?m>WL1L5 W2^bOro;=| n˓dÄGei(t II&)T IXG< C_J.pò^y8/k2/|C U>eh"CĹ$.JѨ֙~A @錇> 6IVe\謏PFQ5b.>X@VEq-i(/kBɳC鄉v}KdMW)JFFJ 6ZroJgaz#Qҿ02c[_._x|lV5oneG ۈK8M\;IV^E9uS-h^)&H򖁧Rg{E[<{9cϾ }R?'@O fAv nJ;[_f漍-SbCvv6 f]=֜`>P٩'}gz9>%`o f,HL7{g]sx> m|]maz}ٻՍgjN,m*~d7n0xea0R_Ƥpl?tw(YDK圗1`!df&M %s'r!T1JIU=a렌ֈBTN*T Op˹ -$ZҿP;8;9 =P/U1]抿C9R`,JB&>YJ& (rJEif$gnDG4P[yaTt6`88p*6H*!42K#N[w9\pe"'0Xai3rM2J=jilJZ0w=8;γ%~cbLDuF"{%4D\ L)sqS9֓w@NZLYVx" '{zF$,IEdAbăd]o$գZazIt˱fTV*6,@(ZSF0"H =pT*TM'I `ERߋx)Nΐ ZY%d0,z@ڔh$B] ztg[+4G;l7ĺPY<(0@qb\BgDJ/? 
& ߨ|"+!a i=cNI(+pLXR)K}XRX]NZ˭7lsbÒ<`v>hV~NZvRZ<Y?Msߡ1'W$0<**<*ZqpERr)b WYwaLy yq:dQ31p$Gó;5F7G%6YcڵK {MøZDo;;Z؆ &_CFם'Fgt;Nӫg+!oYXsS GwYauHd}y %6d + }"a棼K@]VpW"NG(D"aQDIu|ծ.SiwoYRX;Id3<#KƜ 9X\B>s9iYU$9j*uoD $Y1ڢDwIGo?„C^_a|XMѵ(%b'ԴIE[RH(( E?JaE)k#Us+\.j()F"Q+Q08dmsj>\^E5b>Ybѥ`D)%pMNԵHFƤHT'.p JBu$(x,;vDiF*Pi2Qh?k QLn!Vˮ+eIe&6V:D!ـ\k05Z0+k.4אu x0N+GQ7fa=gA+1*P*JvTZ2ɷ҉%'Uʔ.b ~Nc;[-V-tC*D4XYi)̨&4zd67qs;[q5V^XsxeY j3xWh%Wk1h a6 9cGŜEqN0 sG.)bD*p`R 3k'D1mť2إ~.J P@SlDC4GQF4ͬ5#yGD)(`~G!A!NM!Jۑ]RjMUR9^P(46j"b $C څ8мGwXZI@<38o0(ԿXbF\*"u̺(NcB*&6/ A;L')A"_ waV9q휉r~Y;n= PA׮z|ou/#_Z$MY 3 0RP8t>o9#+u%SdHW=$V*X( TPPj6z/3q=h5'@i\@2QӪAU k2!`ic͛@^BB&>y/d:BG-8 VgES U_~^jygE3# ld-ŀb$Sƪ`|vntry~ov}0di"YrB,G-VXB(;KSvpC*Kt]H_>1&!Gun1ᡔJ p[O hg]+u 9.z-b bhwXC-D;+f bro鎠gcyAP @I25;Ƣ(άI`&#E(ͮJPHƁgUkU=Ƃy0+ !.}$R.PB@F:QbզX5{-6'tc鬞;a$i,e4HX6 ئ;J-J =UK F㠝;<_:ڻvvu\]uL-FP.n[f= ]Z[ \;-,Fwmܚg)DzI IdƤ  0#bz8n -ȡmM1WTs\@7"fhn_[hgr%5 *P22 Q%fTiOdPH;@X 7o#2l)b*'uV"⤩Mk<)@& %&oQ^1`\8 a6 Ed$ӽİUީ3k ?{mDVc- ̩m X{uL#5zifz@IQ{/AFL(X6#TK.duN^N{i:?o*UZ&zfH֫Yë8dmicNZSti jaӈP/zCL iDF9r%#p{*xMWPO,iJ .InDqXoCn֘M5墑r7c4D2 J.YxY 4I{FCtN]XzՒ0[`4r/6F,?!t+zgjp!GzTQ~ԋB˛ybBmCz]!%fP5sXߗb}0!lkbvqm6|&ta|aEmR`#Κ ¥@dq_+!% b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V}J gxJ EU(`n4OF hyW߸M*@ld%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J VU0'=% (6? 
EHV,`[%*Ul:C6R^ Ux}PO\߆;1 #BU  1=2E}hZ^苁+0QzIK/#pͩOR[}d[<}lbw֪-N5厾s6U"Oersoɵ@KpDDǰV'm!H !Dq@!JXT\,_]3K/_zi ybszx}da(B-88FI;D Иh(&Q묡I@c5qzixUan/~oߋV>(f};7*gƅgYb(%4 EvX+E*:EǸc\tq1.:EǸc\tq1.:EǸc\tq1.:EǸc\tq1.:EǸc\tq1.:EǸc\tq1>m&zr-o?SFoɤS-<<Ba}-xL_" JR%(?>iwDnJ{l/R?$OZ{xqQ>6  };8lyT%} ÍX)-%L1!,X eCBx"8X&(mk%bĔ8Ǩjg1{>| voSNR34׽c9BjG hmesUV(V!>*; 頼ܨ<)?iv8bD'AG?5qs'=Hc6NQdp 9pk †9%3e~\|ٰQA֗aD iC&~IAsY˞sOfҜDzY/wUGxYe$9*^1ϑ$rrF[^,7V3`ł]2FG0C`SRJ(UJ)oy,G~F;6-y?wLԤt (5^ͰN{w jDd "QX{bfo~!mF)cb#a1% ϐY"  jI%NZyB MIM'b0B&PƸTHH G|ʂIJ8,蠳͉ߧA$L{8Rsʡ%FiN!o1v0`\RxI_-ߎ,TI^t}.Ɨmk|:0,kFa؂11Ri!F|sF( f<2 Un2p6KSru0:+n貕m{4&+e-kmɞHeBBI6ydFٓM/epy+PhWq~52RLG,VB?qIssXz;G%Au|c.!zcZ$QKqtGURfBH4$%"2"%F.^Fhm8g.9PhjoJ 8ۻ_5iw.:,OH>woc4^`up{ǂ ]=v2tD=%8jlcSh㺺7*J%&*`J!]YnuI'gʠ JʄV[LpbS? lm>罥Pg`k^A6\Ȁ3jg>MLj941O!,8ZI_JԷQܶUʭ^#xyx0̡]Z3`ME 6φ>8G߫_7ݸg[*~p=c)U?_69 rLHv zEҪf:^m#=[M^bxc"V Ĕt2R.Z>n3 9< U`he)1H/%!` xL -GeZpU+,P,Š'i3w+>q6}F_,0+VgjXZJhB.s{}|̦~@IBxҾ41[P&qS` F-ƛx[9. n'dOQSeo؍ -7h!54.ZRo739BLHYW]b*/<%XB0au40>ynxX Ƒc9ACAPÔIaA;Ri Xr=+\IGŬ@1~Fm8d;,9%9t:ssx&/|ēۻ/l3Q;nz킘W,OLNVʌไYFc~z sZTD)Z1le%_+@J:<.WqB"@pRaAFF&?a\{3.3`#FFĤ˘C68#>6:'v$mD-񪆙9݃'}☌gcϔ},癠TLbG`S&J8g+FӼ*TY`ΉvСۡgCb.0)Tkb ^N"[sB‘6XXH{T-5!$4)Xaj36AP0{-#chnEbʌ @Չ߱It2?nn?CF[hUi{+>"l=]]O:q8,?t;VѦvwzQ u1a~+hj]-Z~⪝OZVвjgzz^7y+v@ϝl2m=2'xʆt܊WK8]zwAM%Icik8m2X~nD`nVbg9'f'5nMFtdơL{)cDi k%7VW201 ‘?YDsGG%WC`2"ERyBPDӉ W[~9z3'fsE=^}ٕMzXm=|mnnG/E!Jk.V<Y":<|:ĔR (Y)VkNZTwN̎3n"<]LM/Vt3fi">S욤R\ê*pT-*OD (#L,U$T"_iY4a 3Z̥0wRe@\t2nB\tqѽy.w ,; RipZ?~L^fLq$`Ws])y(-)%]m^)dDb(''ѣ@[鹰I`^2`#KxjCI{sdHdkS_ v|<S3;U•m 8>Yf,+[ٙg{rֿ3 ;(hǜ 8QKR.1B.*^tf(x&`/ZaC"x8Sa/s]\)h4 +r_^W Qne:~¸~Z[Տ^jPֲgW5tQEC7!'l42ƻ q &y҇gg^uXNNB@`|qO?6+|&go6+l1q<u>_K|DPkž-mZ!z qHZI%䖏 ؿ!ӪXu܂8Wb"_,q2ɳj&͎;qvEV~ ?&! y%g!}=aqJN9ZP[QNE_&tSKM64of_ hgaa}%8=A;w<+y\J!{{=af_xq ѓvQ6i7anP-rk崒63. 
tlL]`s7 6w`s7nncQ) 6w`s7 6w`s7 6"zDLJ0Eꮢdn_x˜FܑI\$,lئwvuf;r~ # Tf4!"*`al0BL4H!%3,z:zy`aVe#JC9nP4K^4'ѕ[(MXR:YRmi8/k5_wզ3ŒЬчن%$y;.[!k\8W/2MibdTԳf/g嗫\t`;x4\v6ocϾ>ݣǯ_fޠ?q8~ZNels47/7<:'!L"ʬ33 Y cS?u 鶇 (FAR &7>xIVĤ>[伪PZYFn|_J*/| {d|٨rݕqTyu }Y N{PݼPn着)n Z;~ӒE5ᎧTT,K$V,F5)7q: ͘1 <5ֽ,1kș3%Ia;ilCچB%wl!'1s- "9VFc0ű7AƢ⺱bO|j;GLvz1p$`Tš ;j54Zcр^=(œ(DJhc66χS]:Ynb@Mdwrc>G`+IM, vǨXM2֐+q3w}uoF>K]Y!"fl)Ah?u;[.oI`𺋹ZLc"S!"k"QKàQE57V"FMsFV;|tI899Ih^)6SPШ0])>(`r.0eZ) -@b ݉2dK[oҽ,]b.DQs`en[*Vw Q$ƚ2) 4"+rQq9)B1 ap) _t@1hK:$#)TD2'Tfi#Ay|{1cNP@'WTǀE{fM_Z N%qTPE&A&Jyl +eIj^?~lMsb4s.bT%ʩ [Ov<n0ЉRy-azi+?VJh)L*`(6\i lPwj?ꈣj ?Z\my=3{HdKT)SXRPJj+㥞%hjxR>mO/tӣE}L&WwX+Pq 8p("jHA)mFpf@U<+*əiC(XK KXf{'yax?]W}Sj rK?2HU}VU#lp8Q ^+DB)bm [x5ڌSEՆ9pj* ﳅw*Zg'O~-ǧ7]n=cUDHHjR= C˛{`*^ =9׽c~%F|WMČIF!<@>?yݏY6b~M'/矟K[nOjf3_~%5&q;]\AbVyUnq:S`Go_^L V]ӊ ܇շN¸kRC^ ԶѢg"5\6iga[{/[\떴*{=H14NpY4a#A29Hy3G%KUUbG*WIS}{f٬1ކOg>^6>'βo"(]>諶~7zzM5fv۩69amoՊv_aHZ#ia xh?9M75螟tz,,"TŢ-ÉoqA vPybwU3U5mRU^F)flNJhdi!#'tO٬*ChӠāhѺ 2rwՃp$Jmخ,\T-BlbK0)*W9iC:Sդztkz~g?>?=\_;X!ʇ9ła|!fIN xX;|lR*w*f` Pl3p$mI$)) _`F+s5EǦaBhfUwJLAվ[35bhU0V .I!@>!k8S$Yh#E fz5gE\~+ruqOӧ6/v1{vHsc{5eEQ f«/W2wǫy~9T RzJ|VJ*B;al ƎؖBD9Y\%TU:d96Y<ٜ q黤dsDPRXU"@Ѻ.~^a VӍC꽤٭9:Х/R!dyS %Fi#ϧڤUonVU<9s5 kWkRr +ijU1l&Wc`a-;%j; "{o,'*`.zFu9jYT5jmP \S+TsAik'/2[+BZgݚN>>6#*Ǣ,}&9@2,dQ "ALR*zp}@? \_ K?T`DPYVcb3dZ!oKIē&tOblЭDȱ3NEf1&- $X TΞ4)pt fDf57*qaU.)0z1!1`I^eHbM5YC/X۝W˓[5arr9@Z[7BgRn({Te (譄XX. AEM4W1Vvrq FU׵zǵy&WK[-W 6-*!aJ6ydݮ`0粭! 
Ƽ=$\4A@ Aؒ@S$O#$vJ87tG\kL4ĦFw0E-ZEa.-ܿ/mnyҘC(:;sg4OnEef\qPu7k-x۝Cck'  :B0БLrG˚47,& ɾdV( P CEӪ^!'p7ӥCŒ3M8kb9dE|V0TuQwI oWCsҝ96${"u߷4fÝcQ]o~z/k;Oo(P!-TzyTXc,(tL!)`p%ְ8F^i`MP5:^b*:JmK|]'ɦɕ2dT!!;82bp,P䤱F:X_x4.Fe/ -JcÅ˓ up)@n㹗fyƺQ@jt-c6 [>d4PhEbPy8Ŀ͜`(} 1Z˾b,h?o'~B&Fdzf=!{)XMv&R v@@Qq(0&%rb5Ͻj/8_ځ$0dOU1ZٱS)F0'P1Zt%j ]P ljLB!x4Agҵ70֜~Vvю3̳& }88% m?< N煠7`?_|!'_~т~ 'xJp8{'_b{{w\74J:?ZN?~M|(dR3)Ddg2btc`B yXp !yє͓8 w5ʛd=i (וlt';Vc 2XR/l;,%t2ؔU袆7ޠ$Z&4c-C4z`|f4)D%lվqsWJ,ŚP+ 4- Ǫr+Ǡc0l[5ΫݚRHuW^;ylivͣ'a}G6k"cYlw?=fȟTۺee{owΓٚ^߳xգos@s;n^wOy[8.<֝CS[kydYt/W%h/KV)qmWݿuӢ+RY8oy o_/lRjy9s^h?kR-oN:kx8;'ep( ]=Q6ٙapԁKJXV&4nzl}7'S[wه,4\UK*elC^Ue }):$, Mb_zNIg>믶bWmJ󝨀-ZTs׵bHU;cd`ib!TTuVqIv:QD5::[OScyn˼-VA=sy1yQ ߘC TUJ\44Pe=Dy0VcZɍ̓Q'? (0E1GJָ(H hTHPD k,Tw!{L՟z]~IΩ.NlM d 5TpZ؅Z*oףϮV)Q OD+PC*z-7GB,NPB&sʧΎڭ9?+ ,-k=?WkQw9]GG^j_1Qy1yL* :8a,첥)ZS9t@Ijw5ʬ>W ($qKщPzGnYane uXS,k,SsC+*",.nXp,]V C<>>rqv#&xCYu`ũI&QL+ ;ʢއAH\f񤀳"F >fmVm#URy0#处"d-8x:vs) s5 1K`si1'giP1;!ՈU!* 9M) tyq]}3+AVy>!PxFK2%W՘`3LQBN0sug"~"BOa /Fcb%m eڹ9X<]oGW >dGgTHGz8CP=({ ؖfz{U]:,(` R1M֤[ϷGK%>wgvH>Ϊi~od:oˈnYwj޳KFQ;-keL9zŵ2QNe;<,USX[bb|i=AOܴ'0)<w9b0f כy̥"-Z]{fًƍFYUdO%KTT7kZ_s[Z!p7;/cV9J M_&^.XvN*-UO ދ pc?skax9͙) "%m.MJ}]IWYWp9l4W31WlÓگ^6?~U FD"W@b<=Y{q Uo8 _Eu""?nsuNj1gcnDU 7/;!XvxtUU<[z/f?Cs0i|9i .зf-oOlS)lӗ?p/xKymrw % %r d'IOOœjѓb|,?̋H w**_j_"6'jw k7*Ά# s\ oqJ( ES2ٽ>/6/qՋjg/opJ逌/U䇆Ђuy~4hcy9̦{S 3E y!JߎF[޹f_<ជT0i[νyοx1oƼLdb9s>t6t Wܙv5䬘 ]mC5;'6]N JAPkrPZCA\r2q,[S n^N)./%Xgثt ^Vrt6}EQAI},b+IWMF4{f{.>P86:+We¶:~Ӝ`Nȡjz}~Gk 2xLy&E5n(*z]͒䑓S 1[(0LlxF܏ QGY0|] ' W _Tm_*nP)zn܅Z,$;$hJV:1F9AJh2R!m.{^&!IȅƉwϹI%yTiĮ˼t6-U B}`<3?wmKq[߾I{z)kM|ݲ,kot.^UQbrX"]Z !PDIшD x+{[N$ ㅲֿ"P FV< " N$c \(8dP\fB~Hz+?V{ #C'<4nCKPh$39%80Z0õ哲FAH'4% ?0 =ִ #I)B"C Gɕ7b?hIGN<&Z-=#uH;xD{dHJA\zDl0 HovaRncN$iφPwbN=2U-`+沅4[DNaVU&"&ӓp/O fԼnriTv5\4$,~VD''dzK%~.5iCrʣU+r5n$9 q{Y>ʄ~U}Sݺ7VZ2<}u쬾l0r1s~0J|n]튡Im:xvFkAH[O$=uڻUbr,h\(N^km~ej$|> %wIȯΡTczDƗV0z-篊/?oӐ?4~ՑB>Xy+DU!6h~3d'g~<׿>5eo>uAe51sp}]u5]s#uoowWU(c%B b݀@Gay[n lik 9 E< r]@KN A^PUtFyVRK,X"s6M [8梏|pPe22;TMg4S̍^K`17K΁Ge&'+nU ؃D 
J4&m7:fZZj-)^9Kͨt[c$뽙79P*8‹k#Í< Od1D"I"aQ|`DASՃxҴ9 @&%THj9y:9IF[8F T`^t Cy@S:I&.DOFu &(ki 5)rEIrDg-n47e#m}ŵc)o[: ,r+,(C1=@UʵSP394'b[ UR 0@9jEiNގ^(Ձ ԁ}k':cJD1`\镐HМ> m Ur%NmPۑun;ѻ92er&pAx K. =n`Zt% zxNXbP߃  5"EL1H?ZFHΆ}2F RNZe{='GYuݜ7λ2SgY3Za5[MZѺ9WL#&!4r4# B4Q$ 9ZE$ޖI`^QRG({^NXdlO~c @ -eNj Qp6O|G#,Jrk 1J8it9P yŻ3yiw'D (GE4Nr%٫&o7QE&vx 1X;kߓ֞_~QN2|\M:mmA+9i[Ex[={ ')cfoPomQoZA\˸OR̕'21g;j{Y3Oi2{a$9ԸR M‚ $kSR!}E4P&NƇiAwuX)ˊ`U|5eoDz_!#Rc _(O`hTΩSd D *ۧ믩1MHg- ne5 oHQ9MM%2yZ *6ym+/N Ur14RHgKV\U TH*2rPOZj%hDZ(#ea&(1l )!TIl"@)P1.#T 謏 =P9ᔕ@TfΏńdHޗ*]B%!9^&P}L5*\Hu(mDZ|19/5D BÉ!kr!)IB]F`Q uL:(BTN*JSeaL3RĽ"՜ `: j8#,C)Nv Y;oB{4"*[lySSTH54SL2M DDItk=vƹ 쳭1 U e+@|m*5e2Ií47[>s$9g $׆֖̀p/I#IpV qwg gK>;,hab秝:r Yw\Ŧf^v7}֋v>ˮe[᷾^s =Ƚ-b%].w} 9zow/zsyC\9iv{t盋yG ȸsxsͷMvܸp}{q~uh{D647\zyqxk5?_!#QU.UnGmuΆl޾>d<:J%xO$S0]ؤ<8)Tz׺M7FiG?g_սivv:uͧۿJ*JBNfUlp\l)5(^*cG_e]g`Ӛսd%Q+yjvQ7iBS_Zn+nj2YVgj[3iWtD)_JdbSFET-qqGjOpU\$EwCO k6V CQ(c%uNkM\(HI(YMLA6 3a6w^Ҭ#mކO D3A{/9w%NѺg> -9\|*i fZgu}=4t}uy6|Qpj%穙j&L)8$Fî&Cd89kK,,e h_ᔬg;xJad݉/l{^U;e{KR}&Ji8*վ]38ij~ma2V<9=QqiԚdU9ǵ7o*7ZۜL:Z_TgO)bZQզ]ɡ8Aι,&q`mz_ ײ۳w15ԞO=/t'Xho68Ȓ3H17`ݖb>=*/N|^mo w}ِ]?֕q':p I 泷 7^OK2D17/,,.[KzZ }X3?aaiG3AzU-~g |}xszyE 2,띱T%4l۹GKrd5BzO6oNCPoOV.\:_5p?pZ;=%oNlޚO,_ſǝw4RN6ʅc'?o'm&km65܊5;VǞ}DR8n5yhT']Z5z$O9"QQEZXOUSHT L29xӁd"}C&j}Ssׯϫ:\\N^2 _-WÁ%߂ ~|is蛟 vO2רOߟ~iniVPE^WvIK[lǎ(WWc_}0˙Z|GVR??KoR_B2|nz3Pû;]OI9P趵i9ӣ_ӳb !35fWV]Йệ.;g>5廛*bfC_Ѹ+>W=97%]CW?d\ts5W;{sDRz~vv9߻PaˑOV9o0Xt2/GX1#2/3pV8d^̮2+fw+rZo#4ߏ˩آVds"oUp^wG^4[W1.7?_ϘQzߑE|uݱvs"7W}o۵y7{XFƖ즠rhΔӬ%QfXreұZUR.XEA9#Śzx-ЅAF8l 89w{Q^k)EL$?;k-\mJ>6wTsԚsR>4Vidl@UM\j)!'ZΥKPC21^#1ŴV2iUcj%Lx 5j"&6eT?>Z+K"LAt(æ@Wa2&?`B1fhd\ZIj)x;Q~bm=c̷ws&HKg G|JJj~I$apֺK5.m|.TYcOk;.T d8UNF7J_"@]}]\5$x vTFq@̮X V36:򉀔#Q[r݊JNWp A^k!O#8 5p;o]E bl4q,Fn5eEAYrӎP<⌥{E@BY3kVKnG%' #R J:C:i !LHȿ0g^dz@X8piT ǖ8QI6Qt¾&(=;)<ѣyԄ0o _4XH茸 j * <@V !XP^ҼGcqT@8}0(klm+ ۀ8 U2YЩ ֏oT>hjB"dpa[&A[TH7$hުf?]D؅E,Cϳ.*1F;K8'7mPmB$- )dž Fms 7@1aZ%\h D;[a:w, ԁau*I(lwX꤭;XFpj!huh4>!:AȒ=B\1vE1 2SJĠ P@ ZGhwEDޙ W{0q( /djY?KQ@-儤bIZȭ {}tg9 
Hh@p'޺9H@)BM>VTl52P[A I f6sVΏCCOJHlU\X;(m@ <T2's):?{㶑,"% `}C6' ''1rlb}(Ho5EiFCy$ "f}lϢY.5 t`vDM"@#H<6.xד fi5:@8/S  ND4XoQ- \5nSJ"DI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BIOX%ψwK89+fO"9=@JJ$I pAI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BIOh}N${C s$$>EH xH! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! @wS-ݹgU/ÒjZIOmB!X{at5W IOm!մ?(q kk<*:3\ igMem'On\b߽C|~,\ N]aDqjz.]xP>PwѰ QUH/D0V͝c8~/l# ]&a(v 6n/K٬߳ܔ¥W zݽd~h;OίKk_e)'ߍFpWxk ^k]?igћ4緲rK[=bia&mayK.>|t\Qf;R[W8WvS>?/GG?mZ zSopA7'}֜f Oՠϫsa^HC6{oӋհ ::ӥ4Q8_m*qs;5C_c<]N1;Ssz!M5 fEzL~ 4M.C8ow}PY^tk0_5m/W"-,at5TWS ,ۋ4M] ˬ 2^rc?^ )Y2|3.wɖd{fJ"f𢙭cl¬ߜ7ո^}o]>N[_Wmx5}Gͼn0$Z*2ŭfV̝@W-OEr%q6脂ZϜλ\L~<[0kj>`sm?qoßRj)d4FcX/5yl* q̸s6y.c,n⻨%@L5.UpKӤ2NCl4~1*Z~=Y~ɺwWr'7xۯ }ޔz6c(trd~¯6^=JǨ9 L.L ֔S xY:*)gR S`Pdbtj(ߡeLݖ;_3[ؙfj yǶ-<-\HTmzUQRҨ S; 7,wBs2҉DcdPBE S~utWw&ΞiBSy| 3 q/ěyuohz!Vjz1ky).?\dLNj 2ŝ)t&d4 7v|amban+0|ȒҲyd9VK5`fD6}fFA)^ _X%2VXO8a.#G#)Ǒm0?d4_Ot7+Υ$3/. gKZ $ iU'TJE#x*8Y:Uqs3)}`|| ~ÇA 8moMD>ע85>6=/z2.BOl1F0@cKH-'2N5gLhc.j #. .iG]iUPu4W lRi&ݦQ-' 1IYj  NΔt ƝS)R e' ]WL݆aG o6^p k:.[v@)}Dpjo塆gx]3˞% & %w[. =EA( au^gn h2hz6<y)SNQ>F"rc䀖SuE_ߪ& Z2r(*LpƀANuyǼFisAeD2ю G}gfaZl]8%Խ#- 4,Bv/uanݽQxBJRY\ LF^YKp 0̝-?=uOx4LO]թן6Y3t4sf.zz2J La/4)~?HmEѴ> Ei_cKTjWjmZ*_I[Z\1˷'/ yƕYE{ 鰒$Um[D\HA r?zapUʔo1YRڞՙ__%x6_7-{MЁ毝ggd~e(/~h2}! \oۅX_//޿ķ/o||՝ DQ^~A?zL&ȼ(pFv0Cbt4U?@Bøx/G5~sd=^]4G6WX,~aJYT(}%T)G i}˯^~h59cm&z SV.5g? bq%  cПϾ)F\ߍ^ÐN[2rK+XRD7_cSO#}Xu8[6`ؘ4]xSr[X;)c>g/ dYͥ}SH~1jF?@r>*ʏiy;v|j. 
h۠X] 2bjp$/,B.'o vZ:4;xO:WRmxd&s~ j[V8b&7x+ )rbA{֝nr]СkC훛Ϝ?>R :yT`*ARI\EŴԂgFIec:FZ\d>$$uaON2N$g`XvT(B em8e4p,dК*4z!4Jtw/gNjpC\]^wmI_!e% 8x%@; F?e$+Je`>%M$%NUտ)a)i[uTH \XYMPYUO䫘_uYߺA%i29IkHZUIhQ'l-]J@ҕ,n>[b=a } ԳG(Y!,4HZYh%9/jHSi&bBЂדzzjiFS!(8SJXQ+D().R`:`]fM83RB$51COEboX eA;bas_ZoX?}aipGv˜ YWjڪc .C(3O ƨ8<"SSIks)V |zJ]S_XJ #ݧ +(#$_8Jȃ88F)P L&tϮ&,7gZfL>BHBP $&a=]kސIۨY$y؝_F!޷SC+Hf;Y$:::<~r iVxKr|i++ %׃kLWښ~>}샗0Ћɜ'rvrv/v@¤6겵<|=(ĺ% #FaV ᨕQuM>]7zr׎*Q\7ڵziL2uHX3mtj5yT|0M-wvj8mJz0p2Izڂ z)KXW8WQ%aWGV4: K5LNx2>[f˫7?뿯^ŏހto^W?/|.:X3%ԃ&كO~оyCNW=mtkۭq zynEc (^kYoO&,= 5:@o\OJ$ʘ(3BZO9)KH"!6"]BVS68̅Kr}4:ȳҖB 4R2$kI2zr̖bF!K|dd:٫XhbX {#c%۝=> kwn-:zםqݱU}םkp݃xc&o|@cQ١`YU>?c:Cͪle%Yi}e,R>lToNPMneIdOp~"SHSQj3Fby7C߃ZFa4n.lk ^MI7۶U CgzJՏn᝟6.>+vhT5G.A Jƀ"s;pOl[`^2'TDo%X2i!2;@?`x1zWeŎ]3'_e%dϟ|5\fD?Ԁ*e:BvTK 9k% ՠgNw޾e?[M`}~[ {à}goy7lY!.M xRPe@ESY VBMJTɫww?~9h=~7>^uRtWsQLxAcSJM4!jLD}L5_i/mǹ|1&-DJ̄69%bJExRk-j*$&gpm2ZDMMa @J3RZIbMuk ]0u5erRfI;B}1"Nb{UKocB(L"YYV\t%rEf-ll:[y-Pzp.c 34=#X>RĴ !+DE:Aҽggz 95%#x 6H(,($R +IAtyI\1^ZN>b&PFF֌S. ͂DCJM|%CU5xw7J''rHLM%#MU6s[w[M_'|Ȯ;fyrmE%ʜ&Zΰ k'RD"l¤baB`!{,XF뽹EU\դ^)ƕqm̚| % Z Q=N 8(m&Wgp_h:p9I.jiLJAEFO#DÞD=򉆽&t^V^rA+$@T $EF SZ@d4+yrDP1daUd`4|}k1l:[桞}@;nK4AbÓ Bh{|Uc F v|y5:_Λ:{Qcgu]t=z4l\In;ɍSTg.7Pknr࿩T>nuں=oprO67f!le3ݴl|UJE+-7h͚rhXS<4sv8ZYsAϛnܮ e\yʈ.㸏""Dlgҡ嫚[?n.vwn^ϊ40ƢXi˸$1Zo,!kT.HRBA?.qwM{iI{i.kpQ "DkMT*DK%~%k1nk%ʨ2K(JHQbP;{ȀSx 5(!%6L0ч0\ Z>FMh9{Cz? i-'x~& |[ӆ;8C2=l1=bv Y4AՇYkb|V>c]?~. KzScQǔ [i0ӓA,|cG+Xx5oG-wLďfO=yI``z˫d*x~LW, ˌ|05b5? 
2W?w*"Tdri)$}2<2.=+-3wǘDTO].tM!GџCNy6\XfSE`)D%C 2`CSq>,9[ )FfR *o5694yA?nQ`V\MϽ=^|3սIgI uj-b5;!Rq^R ϴT~,U|w4S>162//X[e45-[G‹ f~|tͻY)Į i4P['uI*ĕRL2III忟p")IC^$`0Cw/v[RN}Կގߪw4Ÿ|:d.r =#iɮ^@\.vx"IOܒ^Kr'\),?i0" ɩ:/Tt׌JjiThlI{&<>Wwf`@gS#qAKT$T{?<(</g_.k^`en[axbR݌v@W|upPq 41ƸudKӚqkwL,UK"GZg9Qr>B^L1PQz݋d:=3#>yLnia8IqK\n I.BRDk]TKyr)OeP $x@"3Dp*rq KAY멶^+zн%=6$}IxE|}$GO"!BE>B[XUEJ1KfWrmFKqʌZI^j%@iy})"8kp!&S;c$:,w&-T:E&)YRa`j]& ?È%Xܷ,){ e~Q,?\>7-lm~n Y^#Y6 ƞO$˦y7{eoWp3hW jK6:vJ;aD*j"ղ HmyQk iN꤬Rf넮˿ ƏLZl;I$-">p9f9ggՓ9>6mdS=YACO@ ZKF_zdTё>#SJj|LR eQhE PKF˔p Qv>3T#.os1%sAhrbXMPAŐI U%7Ꭻ*%mc"ZJc˵#WRD9 E  R>rL(q.^(}ˌpvRf\fA4WKCdZR*jx=^(0́'z&mr4*fPNIMDKl@u-5:<,t=|nȓa\FF&S@L%+APς COw4;iyn ԣ AH h8=0rhR'9Ҳ R =NxBQцv\;o`eѢgw?0mPb<dKR jt ћ8 VH"\jTD@!>iCcASln7$f0OXk봜-1OG8.\jit4./c(܋\\մ\:wN4 1٫nĪYhZ:LXW|Wլ+5!|p7~c߇~IL\AKR 8gqYŢ#r`}|c?KW_~p5ɉ=oCf.MJ=hu=([u9.6H>pR_5ാ?,·T"z/RX _8 OsI {/< ɳ i17C"Uaϗ=^#]z]=?|χa>?4~{{~NRyWCKooO#cR& \A"xKJ>q \Pj*Xh8Y7S_ێEX|2W!E_6W7H5JdnSm`,K+=lhr)ya?VߨlkB?6X8)7-AA^w})h|F/._P} 4V[9+LxP3zWcʟab&(HxraDz{l8r}iM򀪆RNN0AU%¼3lvcުǼLfdg)>66T/,7kY0%Aa;lCچwN˟mxvr~託9@q s*mH$~r0S|\ s܅x,VShC LY^ZBke@3e4hl v@/SOoz8<lc/Ӯ\xXu9}EQ1wI]($I@$W*hB4.ؔgOa8{"gb%Qh?y/l'[Mᔋ|)WRu˽8S !/N/޿{߮ :F;8/*ʃR uTx@5ifupfApep[ڠH)wNESaFU3roׂCikۋD*02L@  ÉꔙcN* z<9K֚hA˶Q!OJD)0Y"Ny! C҂k.L.ckʉ&։t MR`cJI`\qYmcLD"FP m7im8; s4=96,呜B4 83oǦԮU=_, yw)OeG{μ˝*üaQs!d$R "'ʪ)zg8f[I܊ zʶ]gVWILOT)JURI.*W~-2BLdP\n?:ݗKB}q3)'ͬrBXM FtԆDS͔LZ;υ 'O5HBpbj" 8RER`ip|xt@通cvi/^e fML"($XC 3CO  >a ?=4SnZh"e1]@}]\ i+[i1)CœT~r0Fn4ㄞfdbwP)m. 
/2c#uZۋudU8A22_%v2n?"^{o-I%3zzBD~s7| 962ꮊĂTW#rpf17%T3DgӋgz U}S+|bҧ iJ6p8+ᄔ7UYl^K3Տ;SjzV]x( b> |ofs.HԆgv8A>47qMMͯ5SAޕu$r`\@ #RyLI $ykTKH(yjZHTK:iv?Q4e&?&zhw;[78 WrN腐cg2RV|18ry >LJNLHDcvFZݩpW:"s,\%YF8V zGeBc:GD h}\}C3UӼ>f9Q%i>7A :lTx5wa\EJo$Kـzi!.cG?B oOi~{' У0,зQܛ՜?wtF ,k%OL}f^w8ɅdʴQ u/̑@pz?KVbpue}Nn"%?]:U5%4WW֒D!5^C>ͨwHfՏljr_b"4eҔ7NJT#_=`O?hp:Eռ U+"}zsex)"!$XP:x ^b(1Qs[!b.bU-ݥhp؏uMb: Y"$Y1k:!w6%v<"> %L_12Jg', gJ˭媣a􍋍rVgSh|N h HF( m%,'$I+Xd'Oݼ0Qaנ#:SYWhlRf[/Ze8Vó\dL [6mР*4rguo#"Ġ!ަ˹t.M8^/%5zN@T6Jk,B@D v@1MƈMdȃPYG!Y2y sI3ddY ܐM պ9 lCt&@--Vj}3V'Jxte-T;t@0gh,aZDP=LaD-rJk BAv*۲V^]XUV['a5poz`+ewep Ld,8ƵY#s Ub1^k G}:?N]>`s<(Z 5ߜM} p4M<跗]#^|8-}k5R⾌Qvq5[.2(, }l%Q, tږ~ꑱ Πޅrpfԓ-m·r#ЩCx_{-7x~0'8ϖN+Dz7:XɭOoH|  rNC/\ft'x_Jr))ɀNx} K:I~|ԛ; (C|oƾQSݾo턗 no\Ô2j-g]KjŷI'J22xa/;Vt`~Fw2@`ܛHTRr.m@XP'`QALAIV<;I;;#n@;)d2.2 U"B#D4WBeR mU6N%8&!2ވ!PBhGӖHoi ѲrZhY GZM9j3 5Pq#iM4i'm<"ԷҋuE-mUD܁BZXM[ M(oO{D2-4xfN<*q0Tkl))/jH7|{e磢X?.WDEqvZ[H4%d 6y`5gw h4=>=XR(3qg\E ->ȶdY:bRwHv(1܅gbA ]LJSyaQ,c"r+2:F/uX;qwnw8ca14}ˍN״KJTz5VCI+E<:EudcXMƴ6s6%萭, պQoDX3%78PVNw6yHѠL2RdNXBXʹ MX@ZpHm 'c1:VD"j˼A0O;WLvm[v;YmtISn{f{ "S}]M3ʎoǾ-m2Z֐kd((\kB Uڀ H8N&n t@Ȯl5cwDgU_p+\6dfCƈL*0Yac^J"Z/@h60hɒې#A̅ƒ@o%r깎z ugC4x'N?u KV]=eAXM[Eeuo',?:214#֌LpNdGJы* h=;фc]%ie+/eθi.\q >!>jXa^Y&Fvv} ՖswZ7<+B4j1 mpWGD*D[al҈-+z1n vWOOw>=qoY(<O%ٓ OY3~)0^Pէ~m0'ߗ5egyԏO^HO3O_zŌ9~"(e874]6~^~ZZ#d1av^3#UF B{v:<5.'x -$ZgZ:^?{OƭL:U+ [~pu!QJ!CRo~nH45h~hݍޱ7µk\6785o7(OTp1PR7(OMGˢ!&iem VXf)KktI`_ksxQܨ^^L"Ԝ4٨[s¥7{&HmO+)ںHNԔ7ۃE#'-L [ e.9\q, FKjD{k^(SPXQQD so¦zAbk8>n5|cW:Fnd׻oֳL@"/3i8K Xz^..-J"bǹxk`-Rd4?箊kkJ8uh ܠ%X8Z> DR0 .jAF<3ٿL)}0^T@\v9Fjgtf|[yu(?.FI 1JO5IcN)!5˖ /”3EMcwh5Zڤ:̨3hQR NPjvWY_TpxSjbլW5,A C򾪤{aA݂ V2hH"Zi aƻ%=IvYhj0n=;͂R.8r h&CEU`TL4H!%z]{sŨIՕ+/1fޑtDCLBzZQq Ag- MCeǨCf~I6}>U|utG'5/2GscO*=~c/_xTt{ᡸH2 ?^9 \MDUU ѯ0V bӧlM,U8{=ڿ1޹v ?]W}?[_ `[h;mVjb Z(uȱ6g=v8m٫8|Ŵ <{+{ 5z>S{ q}& Dy=d'@I5HpM3_X"GSO W ٽ7^լPܡ1gc^8ʪґQ :eXMq |E54T{1eb-W uR*Wo*;m"Rn烀otIY<nL\JU RU3ܰ@kv^P3h#ުt&hRN{|\xtgN9SB 6m@m<$X#":cTxA$Hq 8F*IXT\m's&#֮hFl-dc>MR5)eKid_d.w'[ v 
Z$X.a.PK)ܶ}t|vU6ke-eȔ`BD&pAͣjnDh:vfNx5'y1h5YFC4*è0T`[WajO4kVS&HaP`D̵G)0جlOGTr ۪tf]=Tjm=JCXs\BFB0%Fd[v8*;G"2ERU:Q0!S&aJ;f2Rʃp-x 5ezk଩3Lθ|e#a.(A RcSmzt˰ӫۻ= ۲У9Cg:N|%E s#qZ Q1x&P bcʪS18Bsps:zخK(#)TF;̰4N+TE${vϥqK4]LԓG =4rnimDZΈS(2 QT >"[X)7LbG :!$ΙA`ι`ESa8:x$` a.!m`JTA 6a+mAmc#]z:~pf: YN?ߧutGq&W"?nOL DHd 8  C٣X~R2FjTyBa4Mnjߔ6j(egK&OX+PQ8p("HA)cFpf{;10NzWUSP& -,IVSpGX{_$2XOW)93+ei:>fS2j`eߪJNttt8>?aX!"J;nV0|80N|6 Yq GH.&9UId^Ogyl0 l)0g[^tTv@Vr>.-OM1]^(2F"q$Wt4 iaMHIEQiI/zu7MQ yzM6U1#Q2FХSs?f|;8O,+;,r[J_ίȥϏuWi N&/]ą| ֨?<+uduU׸kf7LUL[L^g;8"9x/Ӌg^yz ؗqp(wwYkC/1nkh*ФYW θNagugd! dYӱܘϺ ̢4Yzjv߸$)r3uăP36DH4X`J#Q>]e<"^OoC\zo;3A &iJq IL#V#E![/ lh,;_NJJȅCwϕQ Цޝ>  ;KNA<+Ī}ÄpdН%:eW a9$0(>ҝ*(B;XNxM}p[qp6\k$]$9 I4a#"Hc":egww .ٌe[Z3:pu@wҡ.;xj9Q/A"z(S7[,xJ )# 3C`t'}|= ի~t/(c0çd9N(%݇l!\v^%Pˢ}$o!\R:(-!13>NujL6?D>uV.R쒵+=wb7qe| dYE_1L]Ą"p+ջ/(H)D+4Mǁݩ`*XR޷sr*39:繼S750NvupHgo^׿$E>m6ꞀZ$KI>W|2Pǒ:u7#Hty:Q0[/+&,ޢMoz~rd"XĄX'=A k8GH21C#,> .EoݕVAzİ2ܑsdkD ҸsQ eC){ Apmb]@KeVѸ䟱*8B4S*#*_јI̓{,ռ}B1]hXWqֽXijQ2M&z4E& LWi'WI/ 4C46 Lim./BښzV;Fr (_)L.UVdׄ vŋ΁v%VwV)&$6KIϵ!,(@G硼"ʬuKӎ?{WƑ@_6F2RW8a |~V$CRvEUϐ-"M,Z 69z꩞bV#2R em'3Yk<xjtuDC0֗?inqp+&a.oZyZ_34 *VANҦܖҘD""Ҵk1_i}4 %2ono=]ie ?AJݖbĀ+&E:ˤ;!IVԉ?@G-/=ctWq2Cp<*bEH. AnU0+i^mלS*]k77FN6YP[я-3]n}bLӷowhb:]Oз?}#?6r~Wz7?Ҏo/}֝jE>JŹTm㠖8}9ZVNnGTcQ3:<:2@$}OL{pq9V̓TFiAVFڠk3Wm]$hv%ڟd@{wwܯTkJ2<7.-G6OhZΞ?Iz;bLԤfݣ5\c<++jAi x2zK|V&ZXk}l|)m{k˱sgi]:2l8¬:8t[%~R7HDlpjz/I؝=mMܡ v":y9x=8.cߦY"c_K$_pXw yNhf!Z֧x'A]dfђd+sW< V b%GU/r;kM384Ni,;۞3p,>qMĬ2c.BvF!V=bJ1je6Xp)H!E)8Nr& | OOu*Նe}HMͿz޾9ZuI+, dLZn:B9F_PQ#y , *[AWP"@FbPIA` VfCŋqQ1RB.PJQUӓ/m`:~U\OzK| s'ѯJ݆}N]K\Wl_JǪT̹ls2x8D!LNN쌈d$Ȃ VEFmMN)ƿ6eV]cdސ@I j#clFz\Vӌ]PTB~{ňܦׅRe$N&ʹG)gYkùHAPrkɘrFK)$dt*b +e(L/)Z-ᎈv̺ 9әBeՆpBCAjq(-}dwV0LTs \5ӜR"8,,U0+m| FE$EQ T2Vf %ssLD^(4IW?}a5Mern`OVOi]IpLJ~bv};P! 3Zm+ы6[ RT:&AV:jS6IyV.L?e"ceotTJ(!KJhI:O, U#I$&"#YTeIdH%V4={ >Աp6SG\(ahc5 pOe?\w>OHsETDF}Rն|,*IG &IQLc]ifrVK6T=AkѮXVc|PS}MmLMMbk2 fUkA+w-& xy.{Z/}}w߅4_(=߿d;9ćg$W mWSTyH^BAV۠l?C0goâo\Co;O\{QA&&ZB9pHR)||ahNIɀn KL)c^i3! . 
1ǹ(^ gDŽKB^_.3|٨\{:awRګй5{)~ZI4JIJ9cTj1N:IU>eݴāK> җ]h(ZVqX#DH Uf,%+#pT NGrNtǜ [~{Q xB 9ȉx-yޒ Z9-T,HڔVPKZzM:Z[ےH>aTdWs͌o AxB`Vx dry&EȤ(iGh 2vPEiȎ!Y]ϸ z=)5c钶=ץksO:"Q2N dJ6Q9'tY]ٻٺ/*88Z)>pP> GSQSQX(Xł>J20ܢ&ZƲdzu^X;E(9KBZ8)uQo BP(>w\Z R'Նef"yln(mFTt߂雑@x)&%jv TZxoQJul0dLƴ6dM&k(:d[K|tn'Q>v&x%AsaNh`;shP`Ed22@&#h,a,cpww:^]a+cjaY~-%jjD 9FG֊(KA˼A8#BJ֖Bh)xz(^}[ob wqc6@/U-^^8^5'/RpҔ:qЩKk\[Ly2yuY?ѬD_kU2j7Ez/Q3Mlه}g\GR(xt>~ҟ0>7BQ<Ǔ2' hVuhP©bt7hDmΆ6)/S'^u{hO;@~>;|1v4o^o7`#9=bs=eɄ ]{-*6M@Tf1R;@m̼4rϟ==Qe"\TpmKeouE<_F#:^z lThe sytޘ),?l"AӢ($Yt|ms\$+ %{]Gjn71r [)XjQrqDs7 O(*@pٳ?F8y2,3tVq/hun=q=(==ƭ5Ÿ @7 Q 5 sg4\0fUb^.П.BS#lkdm$ݐ?+cBHvu2$oWiiC7{н 687P=ٓۿ[ړ!]}+n-]l"aK49-SҎO0-@kF?~ۼy5!K*ߐA9yry9N|j}=8ْ^c;vy,Fp{룰 @?h 2"b+d C2S`m+%Sm@[ ~>%4R),?Klf4NT6 k,ʀ53-&`ҡ4H!x/LdsCٻ6r$Wl|?r w8"cmlI_Ò $Uͮ*~UGh8Ԍւ"8Q`"d|cbPbB:ZEкc@v}vyqQWsƛwHOW}v;7<3phCG]NgzT%/!ܻ{) ͿvZ,;d1§\l/g?XQTBKkb2:{!{ b1ڨ#l5'h]`+tj-rP\27s,xkU΋88.׸'@lCNjӸȭbzgzCi)~q@ƻX0#=p-ȭƮ/W}n(ֽ'}01#!~ 3_׮i`[JیU@"v=piۚڝI-·-V^ G|g5IWxa{\"_쩽_ɼ_8Fjs@"`:!:+[*&Ex`N'"4-vʥs.IRlu$J")!d`$PO1& ^?&}|/yW}ssULh{5}5_YJub[?uT B (tm3î޵8=ϫCg3:sY,?䐬j%M@'ϼ-Λ,,hrd e(GWzb/." 2&drәlڛ0ݓ˃-x42sCזn70[Yhюnkwp:*6xt{'˱;N"KK&+Ikr7ٶ~E=vϿa救!vϷ4wyoU\Ӓnx8˓x3Pfɚ_ ݵ.<{כ-Kt&k{̵C zo3yA:$s򳵉aҊʖަZ$1,E9(H)ʈ8KNPJ}#g/QeNY'Y:V0%ǧr2h:0MWu.3;AyѢM[˝2]YfU'w[z۵jw] n>FZ{u~:&$ ;^lOP^e]\{êk.A(+珧g~}z1k5$_QMМ|_)^GYd W~yFP抂‹`Ή2F _NSQ~фb,{VKޙUx)=dl^n^y yz+햧wϾ켲vìޱeɭlku7׊‹C*"$+9E &b4Z Ik&HYZHNZEbǽEcdd K@"JI->v.0|"5 }L tC^sM}#d| ֔Ɉ_Zؓcۚ3^jc$%y _Be#@f{M4FȍQ$e@3<2 ΃wT`#$:DPS6ߛc :ݵrJKepv,vמ|v Q(#BR08 >j0cR»~l/wYUǛK7\t]Ug?hxfur.pmxo|%~KƴUۭJlh[v{?B邘,d=ض9sRV}\Ŧ˞hnB pTOd>Uͧx}β8yɧ48gX# C̀K!p4?Ny~/΃مᛇ淓AhrF~F߇bʰiҾZ@HٌBrp3dǬySq&+NAa'Bɘ| 'i?=ihd]s[ϯ-~WDt^ǼÿL I3ۇߥpE6V22Ry~%ۚ 3vjk7ov\#@b[BOd墄Y[Inr_OdC_?U#Ui5ݫg=c?~UڒϾJ#-#s m7߆rz6b=M`."=#r> k=iX\ޭ6_Hz@ZRH140N "ssrgh bL+IQH\&.FG &nDe)5Ж&DaX7߸`%@N|b S%8u#!) 
hlcmz&W8X&vykIи Nj5J57XaL·|N%"Zvʭt\T-W-c-Ww,eScF%!0+PKFk,D'4t# EF~=B־b]hky.H4yayƫm i ʬ=p*]ޝgMXW/&DNX\EvA-)ںFqQ7o.Q9|cF%p !&v'&K踖hT>n*ݻCeS@ud?&DJKh 8&9 )$(Mf@d,\FX e D_P xP Q&($q朠PXP]PILaioxt 'duTcWU_ӗwzQj,ky;/=^ SBN>Y"P$ QijFetH#n UO &!YE0(NRRJf,FvXӅ8c_] Bƒµ6ˌ;ʪ.m^ـ >4ӯ\c+!S1PphP2ʼnFڨr=dKJ90l+Óe{8j+JfvPs$hSU)rkl?Šy,ZwƸĭ>j! pS35SD1rD:aEaJ-dDTxDA$h7F#ǘ$Fuևٮ[~$+cшc_( kDk^#n,B h&uTx!0ō hj !DP C" ֱb-m(". k|0"ф4;$tL1rk볈ڣ^l-YKՋ^^&zP>"SSKWQEPڅsAӐQ׋ЋǢqǾPև{4@PQs54^{Vo8EgamAp}3E?Jcg:ŋѰ40ֈ1y-҈k)!,Zc +}ە#mb{iT b΢F%t"R*P *S_+s'9(x{C+6E: ZTݡwyPo.Lj4 U J1rK8+h4M]^|]-g,-T,GWzPJd>M>_ ؑuH"1Z{hm!sz;@v$;r;ŵ sW ~=5(o7+Υ$1/_][sF+|sRHsUqMjTl\s鑸HEd;) Me#U$`i {nw%, Zgu>{fY)Cұwҁj|нe\>F0%~<[oAiuֻ϶;>/|>JOk")EܖA'p)dd!dm F؍F(ܰqz!dR$$S NdKZVt( xc:F&)Z,A*e\2"\+IJbg/Ǯ;wFaR59lsY^Ք|<٠պFMvlLW-3O\`OGC&IL8L02KZ$pjEfdtxQ<ԫVkOMdDh4 Ŭ$"{ GBJ,&dN1Dz ˱ѓOD~T^\KiU.,&VߣV;_ki?1m.Lp:hft7;|6uXWmq_l"ekչ{|gwOc.ZͯF٣L~]^OvZlW.o Z@i)p~?v낏!I) "eWV8aR0))NEoS||&y{G +^ћ0cpRsfEF8X":0`8ϒK+dȀѹrx6t)nޛ#Җ+?E{ꝙxŴʷhR&s,"dʕ~5 Y֩o\8gIHC 'E.୷A:!WCNstFΆ6BͧŅ^lX;* $ZYc[mm)ڧX5EV{h`yǛt(_[^%m:[nMLb)R* bi61Y,fI2g}i+ɗxT04%M:En4rcQKft4X^f뤊ΚҺ= jĽ< E{B)}L3G=n&(IiFL3SY&r+קT]d Q*/ 0=?OߍckkD1-S1 "U ,1\y > `l"K RUB@u4%0CxYfoI K.&)wԆ H #LIw`enD \Gib\kxb0yC+08' ȍ$cAyЗG0]v5>Ln$2BE`倬yfU<<K@q0N`-xS< w R8rUsnjLd:)aAnzG 2jκk2ZHűs)+b'ܺ0OxΛa0.4N}[!^~|tNc |7}}KV08 !76]\8#dzӃ8<;!!y{ߕ}_wn_=٫_xJ{z/i:h!A0$Au557ZXԼӧn|y;}y1&|4*tZ|֛ĬN$#X$crx"PX >yF"&m* |Bgp+?9BGǤ }$f@ȃcd>d0$ɸ^Y,IEzwNUDg08Zwnf}},@-t1k0l[&K ҉rQ e AFFR:%"kAmL{ ǜ)u:w_bE?'W^r張U Z*irrSGZ߰+qӴ6py継ksiLcP(.G1f<+cÓ7ݶCދRH32u!O?~,"l)HH?9AMgl^`e'^5eKմ2#X]_r6ُ]/<9͝U;hEn@أBG;CGML'#Z6fp\YD\u^P?C]RF .Nf-}Оv W"׺i{wxf)ݎosUX1I+͘z;Qh!VwSw[['m*@fٖ;| 'ΠK9huuYIQj>a+^TAp_i`oTc1`8@Lr,{`}B$tH:Y[?tjڲ-߻g2/\Pםڻ\>)Gl$= "혀+O*@TชJ))$1^x;} Kyh<57о7f5c:ojYͪBWWg܅)"̪,0mWI'^jcM4IEDٷ`2V 322 T9GoBpr0ȈɑMr@e])6BH` ޅ$T)K c9D;>9`Wĥ繑tz9MkZP͟>7'Nu2USISQNxH^BAJ+2#9wf~7s-i,rµAedBi,Ęs!!Rt NRd:'5FJWZL@g87 +K[#@KȹQOyzI/ Y^}@zޣi1 V-{߆…4t)sh923T9LRd3TAۙV7;϶!Kr&w9Y 9p _[kJ<6pe" 뺕3ϲ bd+Mڰq8&tid6[[k;ѱuFΆruwqHzE {6X\~ Er, 
z9=3zqԣԶ`ؒgzzUO=zr;o?R\fL2P1c"!$Cdm(6}/)xG P_:ojr !jJyl.-T6Y{hY^DrTK/_tҋNl|^[BrH$ TynLX8ʋd{y pSk0c?4EJB!rp^HM]D&5dr3"ml?b/K-"Mpm77gþ¦[2F 6@2zj5|FMtJ1Ť 5[هgSGfe쩯.FaU9>,s"Ym#,Xq] -6 fm&oM|*fVY_9WU]~YD3_mۚhϚGٲh$]OTTMYKU3 ڰ V%IkC`yʭ`9vrh6´v1sha8eGe[7[x%*Nx"om3-Ə Ɠ&t$8Oί319AOKoWzlAy>=;B(=G'8mߴ50fݙ3`~(ߏ Jӫ'Am]` B4I?A[\?8Z1w23~]ɬNߝn`:~ fmGLvx坯\oֳKv-.]Kח)}ҧe<05\U'GCɴmهj!N/<8upq\yXΆNאo%Kr_#wv1-L}q֟6'^ѫѻOw\r|wsltla^fھL[v]zaǦXN*r1U-rUl\VLW/ܬPO3gyYO4([ZS.UnCq^Bn5nۤ]XLi5%KEҲk-*%eETj>V8 &5AM&/<9{;/͂,~ٳQm wn3O\v^55sQsM,qڝ)E1w)sLi?nt׬qau)q|{<>ܽۍ<;=VוcF,y{+OWH/R>>oPZ5MɭXLYTĕu~Y9hg?w<6z[Qc %rJ+֩7uYI'uYTe4ʝ)tNf()A%R"Y_-flczN{=Ky Dwvܿdܹw|>_Dɛk곹9pO3O _MOnY4 o^7`|;U+dp Q̽ho4k}n{=xycD^L`8Ts߉zPڐ.0 t&0{eлk)׵vI ޻ia<^|~i78듿޼Ԑjہھ{W\^\lw;߸ZpgFn#|}h\hۚ:ڐϷ ;ٰP4ڴm-Fc6꫃%ub >kŴƽ-۠3qC|QF/Qwo=dۛugopqr@پMcp9o-avsԲa_n$ʙh"p=99q }sNJﷲ+?^⻉ډ\\jBq;R-=U/gPT+w;n~[fyqQZֆvR>0Oh{qf1ncЏ?~V&SO*Yz\O %sp4*.C Jo4m0?ۇ͂C|M}dv!Ѐ6?n~k.K]Hx2?-a7GSC"PJ)u<aMV{45>_c(9cO~gۗZ9Ò.ҽµ}{u2/Rk-օؽ!2d#[4czICKm{C&nm D38f2|kfNMKk1!kO0၈&3gIVg.jD@\1GĵO,7C4ي0V,1T@0!6 4;5(9x5~kR#d;*6%[u.֬d"9nD Rg5.p,JEu4vi cm.PG@dD˂whZw9dR.i,ǂb@' v,*70:jB&p:^vyƑl\\4 ( V=XbՇp%,dp-./3c6r9ce] k<;=`kd+Yj-4kIa0TAtV[t6Ei٘ RLP)UhbUZb⪜.VuXS\?`k}u5Eyb dCLh(pu BH8)/i, xAAq ꮀiZ,3T] ;Yc%e &ڕ'@(TT +5-3C@z 4vE nB?%Yc1ʣQࠤ`g*X2)kbU5)d*Y1`(q 92jܬP5ȋzMxgD)$h~G[m()hU((14 ]޹]T2/ݽ"p ]%NX2)D=ukX/[:f%%%ҩړ8֡*3hB|BBE yjPiDQLԴʨVeq*[CJ1Z|OxI$(Q"W&حx^ 16X(j`4sMe*Λ6 j2th8ѓATD6?DN㛧0~w uRSQJ]t$X7Xke &Y1AX;KR]brbQmBe-`F^\}i 0.B0a--Z%|EhD8cw ԁ2t *kY`ml1b\GR=4^HÙ IZB܌9vEEAd Q3&JK, Pq& j@!oo+4\v aӢ4,B݈|,j8INhՃҫ?X3zlӈP,T!m\+w[5 W2TD\XP̨JX9/^m.9U+ 3ƐQ`P捍3v)֋U{^n^k CwH` >}l׽uh8m9"l|bxY:7UԎ{2ٮq.%u#) ߴ(X-r*#ݝR,CS e [U,(+0C-[zBuŢL-[Ir:YPfXu~1CNƜٯcD95\ax^,,StdiqrG[B֚.ZY$um[g[椒Tr"ad"g-"e%1%QvjVs"9Voj'd[_3tq>YYxŭ] x㥰mZ og FoGo~x ](hOE+k'Gb=w.9c:+2!Kԟ`}? 
;_InG/vt=<ۥg/~V OS;uwVĤ#pecEgJ4\s4io< kJ0jfW( aY-Bo&雳^ ,}֜6᷆+T~քb5QXDڅ+U5vsVqȻP"¨;I}'N5"灂D[p+Πz!h1'9E*IXצ|UiJjP_Tw9h.,IQYN/~~gqyϓ巃u{e1.^4E<НCza\_+jhVUA UK>n&I/_-B-;ʜS)W7-ܫp3\G=pr4./r(xP2|J2d`&慉 8GoW$}7+d"/9487n~]hV+VaS0+7^2(nzwkоJ~;-sʥl{ ob Q/?)cp t=bk.{뽯?aAhhc/w3ͥ^6;%οq8~^*z[~vky>gՆeF*ŕ:irǤ\2**-m߇M/ ݧN} hwWߙ@8Smow:w\9x"ncKmƀ)m@'g+LŅVۊ\wr=[יoƬ!8Fi2^rҊhmݐ\Xݰ[A)jޝ~ {~\RxגjsZ̦ƒZVNK)TdRY%kݤqkz%1am c)QYV&HYI)QK/*1՚S8z\ np9ȶ`uFIJI+I#1FqN}tV9".2tHY) &e$dof k(i/hR>ԪA-Z5gǘ*Pl o߅،ra!y9Ì6gᣠXq 3Ql,\&X mDm9'jzSNF`}7Pñk&'kܵJ?}V%Eo VRShlG'_l7a)yqr`KϒQąM9rj8&E,f*MZԌ5gflViM&B޲.d T[T6)&MoqU6W;0x˟\7xN5YfJ3ƣ&4:p(`ʼnI9 %r"bsV@>uL6Arjf %툱Vr@ЦOm5g~\r1Ek&Ǣ9jm`7I<ek4I1 p!fK-XnI"lehUf H*CːrbO<#(|̲mLε[kv}X;4uc*X4bkqF-kD5F 3&T}47Ѵ( `[PK"ՑKYC $7.gP ?K.%,'p}0"TkBO ͝odiY#֜UU3i5)9T/Dzq B}T>PS""gH,A.1p %CzXakq>-?4? 'Z]zܺxO8qFQG;я]:JEݻd};i! `rؘ,RCX0a |s;QAw&n2YR눐*N$#U,Z00SLX/mӈݞfQi7gV;]X,=K}jT}a0U3|-ΪnQ2' ІIiJk>WY< &cqdsWgL(~{d~Kj#ȏdGGc#Ε0ӦѿfŹTd6%l K!d!M#x*8Y:U@IR6o7bt9_)0yNf6{÷#bROEO í c``GBd<)[*RˉLSMY;=jE1FŀNJ4D.\4*y(:sr 6`H4uZ\j6IL"@a6j əT{*EuV5gbQ!ʐގW;1xS[][;jl_"[5?S5VHgxReϒU;-ke̅Aqdj: O;g0\ET=kUjMdfy$TKb)JldgdG#χpc91EKr@ӊ) "ofnĎD-)g'? 1``S!dr1ȲQEPY c,Yڬ_l_3YkGC15j/mϮ7pv(l^jC zXܨƃ6M>s5ӟQ{j =-Px `tFNwRG͌'0ŎkыЛzYib:%A@j(ॗ8BպdHxؚRy*wLCο^Zp%xx E'#?fsF<̮}|Q  w9fɥ$D~7;rxU /:/_-U.ET^ῦ_x NG.||?lLh{ D?oM/}zA [2\ם__Sr 5k{~vl>Z5",_OXR6U4J_D}i&UʑǂuZ/L niz2LYToԋŔ 0? aBܨLdwEy9F:?u\Ai2rK+XBDױ.n>,478{ךS01mll;D4WKI:MmbJy! 1+(0V|oNyƀufGH9ɵA52Iѹv`:ρK"yp9n 9؞ke֙`jjm=/p˜6Y.̂JȳU`A &3)X̲VmL(,I0q u)B9C-Tsr,鬨3Lָte-a&̤K Ώ" rˇC̫")*$pɱ̽_әrNFg°LP*HN 4V1mb9F1κ/u&$VL:]'!̷K:Ea4 c%5:l,:PYSڂ"GOZ[[1p_fD}FU<͈)IfJ2ֱҀYat|J5!`8vO~AcUI)Do1fW%u<S<,'H:]DZ+~!qֺR9?!ìJ#5 %ͽ.@a$8~A(b=Pl Lfp,ӍU 8.ŴU,GxQ hI2NC~'$ǃb{N&wK&K&ӓMO28H'5NݧTᖔbads:GFKeOC3 "x&V3 2FaȜDf,.kڐn$GMF8ˉs}8ݝ|*U ^Wg!$]}1k_^i2a,+:qh̢)%`ӥ+Kk[za5b^?3կ;o///^ g &C†Ns{t<[nv@B<:[6KGxeÈ[:,ǒ44vb~zÛ1݋tTue.]IŰ)+cyuXyW}K6b~7N)NZ)_.?짟X>'^-oh Š S&?h种Qq;u<+,U6Zb?]1!އo^߿o>|#~ \.0,dr)&{pg+vm MMVZ2nK^2o ߬7X!Z-lyiK_'fHIF brD"mlL 9 J*Ygwzc&ĥnm4:ly΄T}1G :"O! 
6ґƒTcuZlUDk0V+y}ksSbIέE b.tiBw#aBwn,KCҕd kJ`@AQeiymy0ٮ`{ &g͹|<>HBK)ͯdiiMOvT YytPsM1v"3AhbѤH(JXN2#mg`*I5>{#ҙ8c&fb,̒6 J1I9`}@&b@7w0S5fq89}GO: 17vɉ+#wolV@Pdܽ1}#,k%SgEI.$c7IP 997$9H,iyȝMec;'WI&Og<>&PBsun-$ň z^KQ4waԪVˋj Tk" e3!c]^wG~OT|,!f%kAWE5hJ)̅$+^'D1 K*S rm߻+)Zg$ܡ@; Y$R6\>Kc̚yFʆΡۇ[ӝoUoPx@_]\GHtj״ fˎ~t6Oc)H6&z-W] tSFDZh*e;I$'>bE!P.3: b_:+'W'-y*^^t{r$TCOayz?{ةhFIA7 yh1F?bֈ*Y}a,țь>1I2N+lh]Tk5<鞝_ca< 5`Kĵs3]jGyKwY_ӓhKJy8zm5/G.F)턄Nzpy>jvӤ8~۟_>Za9"Ji8? 6 v  C*7Sَ>0e]F^`Q<@pR팂:Z+<)萅F ӘÚ'E<IN,s76̊ymXiRٴR5`smXmf2#Gk-Z".9Pz[Lk廨FRVc&i=qP+)Z))ۭ*c1& DPJtmSX#&k#93Fʴ5/[A ݦ꼕{/9\DŰ ) H>JR2xaoS lSP @ޔ\.K%%"g$euDԞ/z;=i'=bM|i'L%Qa)(i+4 D4WDHDF2c)YG+c|H"RlJAg;m@ie)p r ,gjǍ7Ѥg֐%z-|eF]=tH;Pșb\ x dQ`u6A!3w"zL:ή5!kbl˷6c2^ cWB-~=R7G Q' ' v{ñ|ٍBQQ;j.R ugHP15ƔP%U,(ᣋ{p^X$ ]KBoG:ql;np4>s%J&طHTм\&8 \rUĢгĭNZ) *Xˣ3[`Q g9w5mJ_Y8A6?r0Ҵ о;4) \b,KV>Ve+[nd ,!9p8>ªq(WqmTei--P6}VeWV&<Gh*0F+oE:q (] EMvW v!hT2R*&-LSgs5F 2"8_+/e䩊x'[cm;8 =s=cbX&Ẳ9W9v9秣F.i8kT(6([8Stm(w-51!OqVhsoB/_n *>"gifSFej-bgJw @KHN83*YLb`/ ]U$ÚNMEI#1B&z(3:<d"N=d vTtp9ئc2GWSM52FM9vf42`)#7o_{pC[o.p|Yy>ٲVlzF *KI+R2(d,o:z@zbؚ nx:t{՛]6wZ>|i -<OGMǼ ÆZt[zOļs%6WG|fK&Y~t`bc3zzd*2mTACƻCav'ItۘuRterɴBK"㠍(,k%7V203cnC8@QmGv6_Z3!_Z+( &#RI),G+t E$Q.:Q\ ^)QF,CQ.)VS$ʈhA #(0P.RD[ŵFΊdָ|0_laX׊OwA֑^ͱ`IVV \NV5WG/E!J:IO#EH%簰@ g#AL!PvkOPWHGcTMRy|Ԇ^b4/y%8x֢M[`޶YKC+-;۵MUh`Oln4Wy`B`Ç^2*ώa&&8\4`]hH"Zi 3ݼg3XrdpFǕu )Tߟr[}[z$+`/A3G{9|9LrO!l\9p %9asP0~0:\s0nz~0%IWzdkW5p?Zm޾ <=1x3׃OɩcT53 s 0*`[L54rQi,U&1im$B^r$JaSaS`BpZ?Lc}+rRDW1;>Uz /GIk3eiyEluƥu@_͏Յ/mR(: KRWP⍃,mr+yk0EblT¿;Xf쵆'*iwAU)V0 2L3jVʤtR,nR@ )Gr&)L~i0Ƒq3uy΢`f%1k؆bע%v j5+*{ I 0GM,*-jnDh:vF6xNy9h*y>ڝ D& Gn&:}0Rx7Iڙ2y@ #b=(t- ]xs֯8٢ld޽!Ab9.J!#!@#q- )ƵwDDe0Юf{'ԩ]L?=lX[-KP:_U/ZIAœ!n騄[Ag ő+1V`:9+:uxc(bNm5HwD@4:&ѩf̰4N+ TDI$1mq_+Ua(/+J/U_Z l.x8QW)RAi D!sOcc0V;O{wx%$ΙA`ιpESa8:v<n0Ry;AjWֹclZ) R8AQC.2 산z^`Õ` .=^|p]ӆnv~>~= =WHt}b gJ0W)(wbIqIWCIq{(]WCIs Uwȣ-""v&O=cX12¡ bLQ ۽9(F ^ǟտ@! 
H ˥CRjV])70!w2nz(FUq<JyHƳ3{T5[-իV"RPtYgh0NA +@JU)PqvkL㧳^?K7կ'^_og '\N/v;gR= ZQy~sfzC'B{b|qOMݐnjfY> {T*1]uo.}v.:ecWZƱj"BR$ze$ú}~L$_1KC|~y_بN8'd)8u)\wS_u,}=KV$U3_%9/J"ݸ;*͖ UV ӫ>\gpB'om|wLɛ95}XQpD ;]旯Z5WwM[kJ!O.|($ 1^iFT)$I 9%@aV!HcQ+mDyxJEQzsKs[5Dӈ 4MIL#V#E![/ ,X,-;i\g˲?kڧ;>ܠ'Cnê]ӄF4LRg?lx`@W1E$#۠T!4ERcib A_kpXmHHsyh'FE@c":eg?Ald3z)β<\_ BR YTS&.xQYndzAY1GQ/_N7~VtCd8ҕwL/9~.(upYmݎ],S;ZqX={arI\jxݞ*ikӗ^_ȜC<|H*{, d.탕{v%ˆ) ^%7@UN(*u<+Am +רԚtZK ?!1^vR3Tuz\0:y|&`40)/Y+(80>ޙ?1mU'Z/EP.'y`" Z"ˉ_ <۝ V/"<%cN"%MٯΜ\>eǚ0NwT.(>dhS0ϊQ5ݳS;.+ѝ$p_m%tjn*Gf/^eٻ6n$W-Uw*6ٽewSdR.^2c䒔!Eɢ(ː5-i~4/;Xm#~uFrn0r'S4 \@M,(s(1y 1 o!$V4 A}_RR΢ILC'$w2O*ldJN|c̚y kC!w76%v<"fL⤵1rJgǑ<T5O)aOz]l,ur ɘRNj)1e]\kJCJ6#[&&}ɷvڭݼ02,=5N4 b"hz!Dhwq>Ԋ.r)]O$M_ CoW!D3K?L둘w)P7?66q6il'ȥ,9z%+DOugX 2%}▛Yke<11.Q+\k9C/ 9 C$e|#je6Xp)H!E)8Nr& | OOu%jqZVS&~=^?LZa%' F3JorL z1֭n>{dET ]A-M#DpȝCa 8 ϖEDKqr9śl>'|r|擏(1[/ߴ{DL^| NU%R`ݹ#=5j7wv%sUT>#s.c@ ^$-QH9G!;#()UQ[S!0< 6eV]cdސ@I j#c5q#c=R i}PTB>`bNlŒ,d~|yfᗿiдj:,KK{.f_Pvŀ5.< L@0*JSJ?s \5ӜS^:%*f*!3I^tH#OoDN#G,)"9h9S_薓㾈*#q@[S$74l)368QMr2yGa>I"UEDg1m$ƨG%T&"D@KJY.Ib29DԁpqtէjZr_\ԕqQ 8m/RI(ОfJF-$x/EF!$4Hj~Ňža5/xxυI՛_c䝊~3W6ApCD?r`я>$tv=vu0Hk9ACS4'8@zZYAޅAB a\!(D -ylӭߪ|Z*> ڴƼOMJ]}:Xp1g#ˤ y <̹ѰJ!l|$[L4ۥ.NEo}!mFƐ Q OF#RM|#zvPݹ־ wp(o)6ab,ɳ6zY(l-F̿+ M̞72sh8D|@~$G~pn\fy|;7k!V,w%, Zgu>{fY)Cqon>{F@nc: +zvzjzKj|}3sY˩t~R*EN\c"/۲C DinIbJV2KJZ8i!8s&Xg/G^sf!nFv/:[}f+bqS-u層{kZM٦-589kb$]L}$Kovm@mB/u3//L÷m˷砦%kvl{ݳb9_|:>MfX1VпRqy5޻.l2#$!\/~ٜq }c?-}&~N$#%ݫ=͟`H>;^t{1/F)N~Ÿ3E|Yڽ5m[wzwoWmmuh*Xxyywsq@R Y9Am{sy~ɻ.]ImvjMp Wndod_/ j#m-?DG7jS~~F&?-iKǗ.VR_wY[0ٿũSbNK ><"OGo5@5RX8b(>4#b3h[<'* lty,ƫ~.ZkF,'@J% wnihzq(:щ2@w'F]Qj._Ad7ܾݻ)}iNe9-l ^Mo{Ÿ3Wʔ~㧥r|:I/d<{?q7Z]i ][}/<7iO76D#dItEFs:YAGQǓaCB2hjh;KFX?p=etxE=Ns4kG=A/nbE6/HS4yzz-E?~PjaKHGbt4%EF@Tj!!v_荥ѨSC M S|^jOIW#ңvv}wW_ `'Xmeza^ OqoR9S|V>MK &$atS{~gŧ"@PYwyj:Mqͭ.-ꌾū>OB;2%e[Ns"MƋ-+FjkXԯ^>.[$fCZVGtl-u X.6ʁԵ:G:6nXmQIVfw-7&w7l`ݩ;ӿ7\1>,w{s[0Oe-h8^A:rg(pA wU`Kȥv KRVaZgAVܨ}OKyrrэِ 5g/C!/8zt=^+Y/)&^^./=V9}ܞ㚣TvaHH: 0M R!1GH^BAFl'xf,\VJ[aJ%=e9CNԁ-,}P5ȄD+X1 
_!Arllj[R=xLk-R9p<9!Nf01夦4B P 7%H|0lu~ ۥ'DP!:EaĕnbTH=ksHrۜ@Rxukpb f2W|l+[$H"@cUM{z5ͤXխqG(K<\tZD ǀ O\J0IJyc Q+W\4PBlf#١LT$VICenMƦț_lm@[c#@4Kl1F4ը/eiIA{H9HM~A7 @d&_F6h1"n2o`M1̯oP_Vk+Q$r <@KA(Ht Q^um<{Lkgo_&/e`^dwwvIo8Y]>$/~:.~Kg5C.So{+i8~mw7öXFMˀgy v-"}2'C )<2&IE$2Y(͸Jb, ʹN1ì`: F~Pr=1|m]qi6_4[ ""D)01mFt%FǼ , 8,/e` wV\zN#W+{!uF ͦ0%J>miC4/`'cna3J)QXٸXP{SJBOW|'k$:% J n1Ml%3`u;j*\P8NxM'N>IP@ :5pWR$ [jDJKeDZ<bJ]c%z8Xσu=>*| U@35/fvsodQnY\D'X#B$Q64KŁK6<-<n 7*ԉBa+,^( DF S&`HS>&`4pT:?԰رJ,D9HCt#N@Gx#[0TjhdB 2^Jc၌Z; Ɔ$: ΢ FWq1W1FO]C;P81.qÙ`8] t6=鿗6y7\vGc{G%tDN1FHd`Cck8=[7O nQzG%/||6&k s3Yjp0Y"-\P`#$Cԣk-r( WBXP<&KT\%Əy* ~4* S"!|> >%밸QŋZdz F!œkg0Pc컫:9,d@K- &7b kB0=/0G<#mN$≻CsVQpߥ.0o(~XP|u\>Nx.{Co{UZq\4 ϼ OSm\0+l7[Y2$阭*]qZԷ-h8,uX1*ß@pRXI2cSr{ -EHz᫙\:&Kp3LC*2Kfn2ts㗫Lpy@gGL[520,l ߦ.I' ̥P|'VBGf>X)2)I'Nj7^UYITX w7!p5'7ooBoֳA:y`5j0-?%'hkp1 HG1uNbH,Ά|dvv\`O6"c˸ܵae^qg-_u^j/Z[o3-,vTmvA|cu *A4"%AZYpd[+a#/olaʼss5;?D.X.3Yo4Up&ML_fM&i"pV/,rl2YDlQ t1HRt7@czp!FYA[/zi1L5dzU͍)-5f/&yC+.)А)@)R`uX~8$bHK"1k c*eAoW0,0),Xw`{^^g$"fͮZ݅!| 8b'\0:R&2>O[wr퇍_ܸ0Nb%K3+-p-"(h'D"<(ZCa17E4Hh1PH kkCh(AiԂMfcw`MJQ+$dVFWw~)/LγGN|Gke NϕRpFmB8CuكqӢ]U\` 57f U]Xlߒxm!M*Ec7Q@T%%kA/ ꎄƔr6 \vxy. JH6Z3z6AͽIv*!@_BxX,F&3<6?P a`q #Xp(XfwjR.OZ v|OZ~%U+~A3gElvk 8(L;d.:H#3"q& i-Q;at3\ܤ!V,-Y~yCx<}ȫ[45^O}&)5a4l/o&9 ߌplY'j*C[ 1 &u׷yFgRڃo4`w!ۓkEq1\oydN2²/W<_s\,e̢h*|&{C8 5U/j떂(ǝGULqpBvP8[n(/R~"5 "v$'"uIŻDqM0>}B2'_ѺXO9\i.8`c*ǭ@GBߎ+ibFc!- qET{ ? >;pYIZ⒓@#pQvZHf2<2jf7`$|媦!P|ABh]߁ *d|{clvƈ{Cw;|[OۊU \#bU Ll2U#MR*g.p?H%Y7C]%? 
cYCĈWCoB\&olM3ɜى4u>_E^D _ W<<hLSTV< o#MqzpLրT`%ZxkZ 9mm7wsܽ`dZOFq(wxDʖ LzX5=kwpwSg[-x[{,i} - 7qZMN9DP)'9>a-d8S ]TZ̫o j.킗Kp}7T(M;Uj"taJR4Ҥ Vۭʣ'Vm/T}1.a2M'݊+\.=Lt'*:A!VeV6w֕rʢ<")иV٤tjt59|,FR#ӗ!y!fs-OdR/dua ,XrAd|;)~lg4;[kE̺{s3c#]M\!q§Ͻ+1ig`c8ttFO-lhyG&=kt3#.u*+v1Gu*Ô{Glalt1j0IG,w>Uy" S$5ykiaCSתVG9)=ѩ+فSB uf>Ow\F??ĝG/Lia[#FYDOTh ^#s sHJq*,)A 0O닚"fᙏ δ׍fxc$fd话 ~fx8(}f%oy3o^]:&Y'{/fxw[!A v,?,_kCPצhT]NJo}*4{,^GpÉ7A!|R':$5EV;\+x,{",Ypٻ6r$W}m\vwdF7+֋;Gve~ck ĒbŪbyZQPa]{UW{]3}pk-bGa04/7ٙ6 = 5dVvE}VU]vIQcvrj U]pǬCqOTkHϥ"n!X\H/1Y_,ZH.roTROv.f 塅J6T׳`oiılkfzcGfٚfQ r2 ~ԉZh$A@)Ynݤ})sLc,U's3_+ p&'iID(bT -bbP6P3HL;ڄ1K;Ykqڿ0!y%AZMoF;Z#WgY[@56ńx֐<1N/w[E%TQLQX W(u`1)zT.cp[<Ƞ n!Z ܵS ><_-4xB+X9'6tc[5l1tI5zH]YRVjϱRޝ_#}vXnR0 5s>gQ8.F9F:C$Ќ!4ʽ.M|nX^D5D 槂=/LY/ .q /v"D}~m=H=Ư3f6~›R\hʤ ޳nv',˒i T~u#C,])G9`5s<ܶga.r 54*=+\Oj#/%К$+OlF"a3mow\C e|\C^,)KmuDVst`Ը)VBHb=[(PF#S()rr`jLr$<46!`WtWl\)5#C[f%ʇ$Bd}+u3KG#|{%ioE&L3D:ZMF S>H*⸅:6?uZiu " aN'KC y1O;dLjK1{qxEp7W3ۤFg-m=&UNkxba(&ܡǚ[~ +荼P塂j6B[xnD؍o.M#.=+9C Ұ.[[zj j7i]C^=$௱@*&.0Z oCaQYCFkqbBMTB57(f(H4`p>Vls..^אR#{Un!|}nuDF<6x^С6[`C1t*ȕV!;s]ZL<SBHPkl^lU|mPQo;_r xq7!^ȰTWz^v0#DIܦ - Iaz[m~vSfX֕MЊJ3  R:V- U(~2YjCz},I09TSߍk u (Aq`Ph)!dF yfFEzK{PقYcW;%ڟW~+Ubf璡^о}<&a]Zm IH0`6qĕ0<2͇V"En-Bi1E.Q]^PH!^z!繄*Qysľ|.6.^ wE(ѺnH ZþECT 0PAt 61e]36&l"l'f![p6SO SwW|n!~)$T!F hDqBQ ]}FSQ k(w`Ac .E$Z  0U)"ю&|+cmkP<ȑ^V!XecL?t U_AC~1gһ q4{\tl4>'E6MaJ=6@汳 ЪC>uX}匁Cen=\|E.$ cXjp1X[}9mjuޏE2 vD=T%pH}J"l_h+q k@TkWXh`cQv4ǣ86(l}#VK]j|'-F99U^G|;\nybhn>jaPS٧i#ͯl2Mvmch|4i냥 wl|?1%|Cաj8˧qOܜ:q\p`ÀII$d# G%WC~' 'M/>^m ʶZMz4hYˢz&ѤW>k5g0hc^AEI!k柚ww£^5R,ƩcGR2uڳAs]'#v k޺|r칯ԢvUB]{ۂZ%&f=x#@[Dړc3Na.:(DC:& 7ײ z!7姜\p@< :hħ+cHT(tcl2ZhJrNlژElt**4!ȿ Ou.&l@A<tBڷ=ǽud5iԻ%cS-3IGr:io+ wtVӱՈ=&ά& ߻ds)nnĬhx?1&_s`#+0Yʞw |Wї*Ɋ*389ɊoLb>iKrc G:.90Z qZ3%ו|+Gn(5A( 5(gbM(0nB*O'2%vROGV.le ތ/у=wʼ>|3OLpLTUQTx3\W@ke>g TKfOw^Ӹ rO(ly2H"]wù cf,a׻z=5 ˯!GH$"%%`pz 4OPcé&XJ m04L\g@%.|MD'ڭ~^C8([oq.?=ܪtY~]};ΝBbBpyO Ioyoֽhx'w!; "d4O˩~`I8`A/6l3mZvWs'Kұec۝'fQDII%5a5R_+*5 C@:T/,16 93LR:L0RLG1Ib1C42qThSP\ ~3nߎ̪p?uU2W;Kr3;"|ӫGYkP ]$Rס4l 
j)N Zq# mN%јHFdlc ߦEXc!:>ǷJGȚIH&QO ]cRcG`"زL!_8Lg| ԎgVqtZ9EEXg`$ű9s!$62yQ(INvFYl}\dxkw370 Q<3ӛy zc:gÑBX)HL۰S)pi1,6f1Č2#%3@"Hݑ*f]ZbB#GT՝M97ohz;M`F!{pY&vDw0X+Mq% 9ܑx%fsV%+kOS7vtwgZFIy5̒ٵW^{Vᗝؼ_am g;ޙ8Wt߷(ґ,E<9G# 0ՔH~UXU"ρ* h0%v|Lo~m\$y=j#h;SϏ,ZY`i̓炉at|-h2ɟ`gmI WZrV">mˬ,\zM܌))cHmi7 N d=xԲmF7i$b#g`O-^=22k޼[\1X=>n]YDH19ĥ$pA&Ά/_ Axwwc<3 yߞ_eDba2p3SO1o&xl^13eQD-\ FH'= b`nsfp%>\X&cmr׎vzP?c(FK[G١MGyͭD{9qXFRNoDY؄#2^FYp-! ׅ k8`A(Kp!M‘y%iXg%JHb}DTJ9^yk$LڼK`oBbN?Yԁ"Q1xqB*%%2"$mo%{j4e.b)pbD%o`B &'~K/9DQxVC:x)+#ŷ1 &aT8" 8 !(X .L $3&h(^]4Ʒ`?ԍ. ɜ `6"yN]A =m] 'Dm.עUd=5AGOʊ`݀! %K"-` [b!f):r,dT>s`}d&ruA;%4cr"`xFX2;Z{-BH;LZΜ@߯w ql.tTɸ~x"?&'Z;pbHݔdbIp0H%'0^Բ>?K^4pxGI5`1f54xHWĊD8gX~q%3 V82UL(XEG5hT\ ZqeNw+ AN2KFQU#nOO6.K!w'`7`.jRCgxasCclQ< y wGsg :')<Ӟ!0>8G|.fѯC,4VP eR@N).9"LçbIj"Ck7¶C(mQQ1z xQvC 25gw{H.4s*l=ߘ߇(o/m@땅[hϳ/'p/y7f>"Ispqp}ۆqqf0\ـ=@%)be:Zab foauH3#☽u˥v|J^|5g 'm4I&kI4fv2+wE-X,>O.C`0>&_z1?.(ar_w̾:y+<.*=LW`OaN./1_)`%3T7ug-f&hRaltQ~Ur-A^Q MoxN7B5,SDD+c=8*وݬ-Wq Z$sއh&?C͹lY_ zk㣻]u7]|21XXaIs"M_ٵ5GpV bn03f#f #RŊ9{B],>SJ o-m~_&xQMa&\*)!U$ aUb d`&1 \+P%JX}F'ŰFfi+) Ԅs;R`2N5&/5Iy8?mRϻآAzidy74֬|TNKF5[AXT )U1W#nO#X AY$ [ਨ ؉/B+)|{Jf-~4Xhgb9nw1f/aȄujM+YJa{.;/^}uxs?`dp388QϯicO9ۀ<;PvW`NY~itLK8^#fX2<(+ѺWD9's a&ZG  'L{A#i ҵZmn>G]iKTem͙WRV\\1*pr4K!iMӚ7&JU OE)6N?"drK/[RfQ._ͰwrFv.O`'MP1~џqƶ)=m|aIIyτc">@89}.ғ5'LosT}JjkVZ1ASS ڜՠk^ -]c@O9Ֆ> yK!ר7?|ȳHcKL tQ9nt3j3 -3_?jNkvgo o!!rʙR+YWQ9ƽ(>)~S8TDu'$U )Pڲ0j=":cu~K0 Rʰ;5 Gڍq{M-2ͻ.Ӭ2]vK3ui3iay'[/)&B|x2J[e0Ik7t D5Ė揜w#gd5[5Y2\ޯuw2,S5Z8sè aw#e5>JÈbڶ'NW]yl'O~Ra޺wb0ZyRDG2:Ͳ}|Ms0O׾dO^ֽG6K$QW)l@D9EF"Yb?ZlAxOK.̒E') pĻ44h?ZVXo;_pF,nO ן{ S|-ְ'{5k^ Њ]J-On>~Pi[8f%Ǘ֬cY#Q8豔04y'1N;w{A٪,Sxh S;K;֟_ȷSpn^qGnٷpS@l%>K\;cF3.?w#G=1nhLBֽKq0k"/fk 'K)C[??8ӛ_ë{pq0Ln=^N_)7|LBE-qZ 6'GuWy<= |ztf/Ed{`&m $HE'80ά8be\h0–ia] vMtga<,ϓ|]JPa׋W_%n )Lu|^,v~1_ .,T DVy!"rDh΍&QL8Hz2ԂeB BA K~GgA+ #e׮q35aL *ktp1hyD2#P%:2],wQ[1_eqڄmfB8SZ _VSJ,SzQY^fayY1o#唐> @abzMx\@ZBA OD뀥Ϻr¶>洺6۵6R/Cʁq\YQ0 *C@ ʳ(Y/1=WReEc3ZW<8npL>a gj@ZRM9|4Z"kQ.<5,]/Er h$אE{e${n8 l,FDsb<]TiZQx/̠I ) FP)`z>['SEo ̊"{׻ `B-]yo??^eiV+# 
QB1|2@0`@HN4V.I4i0(\ӍVd kdG 9ڶŭxqxBh-v2^jG=ʨ_?ލnf)!/L&gq4W(Lf<N4QZ9wi̓3:TL0!<3A_Ofw3 |=sUJMD9_;hN2j6a7M%;v+ GuJhn=&RM{c[hN1ďÏj4jUo6(U[XTeF/F p2텓Wa>]sH&AJՒ޷'o!߹&{E+K"gtuw}q4,VkBWzqFpt})zY4(mJxJi_iciEt{{q٥ֽKq0*ml Q¨(%H#^`QQ'qJA:Ln&Q2%#P0j%sZ "14Via\ c,xωXC4Rh+"`l^0jAc׽. (7|@@pAbgezmhwE:$MeNdR rT&v@dZ{kv+h.!߹`Goq>g7F$) ҋHѽ[Ds٭ UD83vQ+JŞC3u 9!1 ,`t@8F!BNK'[nVWX|4v!e;NܗXXm%ڊNߧqHIG\dp#\pFѺ4PqhIkqP"bI(?Htm[Ky,ZSuv0*~*qPTNQN?{a*~j-C^8V0kkb/e)+{VhH(%o94lh&M<7@=1G;!BH|͉b(^גV[w4Ah~՟CFLB 'cӬS^1/rǜWکi+ 1r@b@+E0 y+7zل62V̝MN+C(O?ECz9Okb8q|khϪݯ_B]}98䅳hOm\xdYhxE1T;.s1Z@z΢< 0R}bB _z〩6Oi=4Feg"N C Xj9*IMل|뽎&K&(iUj9[7-_툌ֈ.\ŘVęrF(HH#u%'L ẹ̬|@qRnB][Vt49H'zg(7]3.l @q-ۥ]D1NAX.qʰ)}p g*bMvCvAQ[NYK9a,LY6-ۡ a⠈u*I/.>TZz΢D/qix_7zNfvz}m枛S{!?8‡&) >¸dxs&d+W_BsEq| zW.k3V]I3eK];.٬!6k4e,v᜻RQ.eQ.eKW̹{?z? A)1KD@k)F iEI*[Rhm9uF5l%F\hBs}=;'zV\Ύ4xOG4xOGDѹCC0$)$H9OVmšXr@ d >_tn @WԞWqg\`UGEFMxs5>xῌ<9S= ̪ Pnz1z:糟n0>ΣπW͟w\α~>op(M'q.8,9pVÚs4ZSj!a&A &3uUֱ Gi3a&TZJ ‰P+τ`BRݝH\^R609߿yS=^u._q0ujH_݌[x;Y1ʃBDAd?gޕ,UސVl<ϤVjO^_~8o  t$)^UDiڌ(bYZi}h4ݤ(N;$ URʒ$U^;aIIҧ=A> l&O)TM|RBY,hRjmɎAXAV}{]j>W/$xUz]>XAV}~j>M}?Q ñ,Z`[z" U{'K>Q|$)3/'_ڔa=a3Ii j{ʅi@{#4C4MH#Z_NH lD^w$O8+?3s\Ŀcy $7Կ ^׸|=ңO4K[Vzx<\p}8 ~* Ŗq~]\XMi &g麟{׳3{K&SyoiTᑍ8 #>њgX:\_oW:,yL^pҁ\XtC/"Ѓnρ(fq"Ҭht $@y5^IMm2_|S^e0Do35oti&[Q-;pe7*lB%HTJ0/\w4к]g_"rV@T\Pvd+g@Z|A3gOm bAGVDi#DqfdL#ԆnWj B_-&$%Y6Oz$j yp|we)"Ԛg< rHIh41xi`>ši#TnW^Z=g^~:I2|KkVHvD(P*C[jrm0Њ[ն=Bdu-otknnO= | =/ l 1V8I[9}\g::t֡f%02Ar4$)*QN!@ |QfU>|cn6pQcO}Yd wZ#V\b*G/4!U8FHԑ"j.`:(0!LpK0"w2R p@/+r77V^rGJYB#ri,9[Fp$[ C(.$,:k:m`ZyLO҈'e2R&Ȭ"tPSD,*ŃK94KꅒxA@D=f!6/ rJ^E~Tooֽ2%Ni6@W'OSZzL{юm ~uI G=ʯt.1unXLTrֆ?L yЀA{tBۘjdn7"wX 1UL8 n/șFD3%  eH J<7݊jOr<‹"}@'=b"[=B#FLϛSD^?'Bt,W-3D -89ղeYˎl["N%1H(t5!vH@DxAX"kT%ߘнυCbLt\RŨ>1j%$Inb2iFG%]N(\nZ~XtXVc*åZYB&"%7 Ҙ w*<@3`" 8b9$@(#7M'iԔV/z٨q_jTV$^U!1qOG^1psJ4L9uB[M E;PH$ecQ|rZ;jUHΈI")P HIH@XT!~LuCk56+؈ FMxs5u8OcԐZ\3g`A%6OR#]|w ;D.WT޽޾}()zY$6rzq_W'\h>"l]n1!FOs Uɗ \;|ՇPx=8BoGJjq9s84gz92XTD|a9A@ȸM|jW-P8_F5yO4ڑHA`Kć -N1 !Ѕ%F a_6@-1ՖxtЕHє"A<*`$=b:`m_'`Vchq=1 P(UNF(Zc DFF#jTVPfvH? 
1#qϭғ+Qbt?aO .,rU6_>-Ox2Oܛ4oO==fzttz:BWZ})ڳ_Ԟ jM$td|:DL.]/'5HnwdGlO x)4͚dxe_Vl td]VЈ|5l0+FDXyٟRbN.H@64Ol:tbت́m8cΊ>Ntۻatύo.fc4G.za^ =(CP=؅ lcZ ;-@zžП_c=PL*Y,ܪ&}lGޡm1Kԗv*؉D;Z ,@R!fxBqD68z:2jW<*GS ۯP?ȏ+Vc nmk2K9&:eǃĔ$!3o"4d#s]]{T9)ՕW`$hQE3Ti.إw<02ƀ'0_E& e 9b 뚝PvZVN04*VԶr+}:?JT^" a rY-"FIG! d@3I*B03v ەVL][w|Y&}_fOnWw| e=OGE?Z KJ!fjg\+H*$h|鰥s솪aRI9Q7OE+|>jO`VWl^Rg<[̻&8ayL߫'jƲWfƽ4uBر)jx e%zҁ=s~㝷K.>yWg=Kit77t410tuӀʛ.WzrGpT`OO{!6QF1Mm׼#ͥ9NX IoUh~7c' rT2}ƽjQt=؊38SqpFJiwV#/Z6+m ]Mb[p66Χ|/O Ea8[8ڋl"h'l Vؘ9{n c^d@Om\=N.Ls]锯 S*o_5tFnjn;=F^=8;w1eXgUHs %<{2ZסfQ/Iz38C |+uv %8ROX0fŰ?̒bIZ7#*͵D=)H:(|xbb' Uo+Oq"&[zcd"YLQ?Q껋0526Iht^BBJ»λ V0Q&pt +!\ɻ/׽r 07 e+MXuB @;Rx%T8ap>4kڜvv[Q< ̣xRyso5%v/5x=Ct3{OR?'''s0+̑ca:K"I1kT YԜ%pDe 'ߐ)xMTeמL^:.vy8jJfdk9)YĞ0ノ`.K¶`2Ei}c1rD3A#ISKanz:+!s)0Bs.˰>Ygk7<:ÂLq8ѫ{j\ݳ&S,$G1<>fikG@s˥ū+Z8U&]`lA0A2 N5Eä;MeG0K+L C5>x|4vHEft8!$Ê$ Vw#둦FVFPu`@8ThIC., SZf"UH.z ee~BfԲo2]FBomq ⇅A5%l} yh[%9y;9&1F  8@o\U4xYL. 5pDGˇSɀK}[T s ЬLwHX.Bg#0CgěW%hjqJ^-eǪsSiI>B:IF}h,[Lqn_.f[ܪ}6~ 38^/&_OfqaWI ۙmMՒ#e*9G^e[MCĦ`]?=os:fE?/φ]wJm{f٠;7A) EbL¸ JxA7Bݘc}2nZ;Oԙo 1xI׏u}AHY)Q!T*w6El'W jUD*YTDj}G=~O񨖜Q2\95xv99.ʁ((9zAQo}ض33:+apG}zm{HGbCW0H5{'\戵Z vZ@Q̰xu/N IӸe񄼲Aɐ: W)JzsR{Aqb%@Ua^.:eBr ^oكkP\aakK&Ip>Hp=SJ+(uUd&M@S gm5lM@Hѓ8f!J+!ez,tD33zՓ6\3PMLj%Uzī?W gA*Q >Di7j<Zdh5Nָnhkh(JTЌKUL5 ( 3J ƹJS8RγkNTT tQ{"`3N8Y>^`K'DV,B.%b#wx0͘ F!5Zw*S2EZO ;鶴Io*a|k CQ)lt8(JI& R^ł>?јBMlDF6.\6!xq!R`P$/|qfNeajɤTUVj_zEƯ.~H->O޿{ _,S+knHuì$O8c;,{a<)(@,w6/+3+2n= /!p%/t~u4h<Mw qBϿxB%#۹77Nۛc J 3%vsz 8MPKz'`u:z75?ms69vi2V>cWF ¥pFluq`cXCZ.&3Ga%Ӗs]-'*Bz1fP*Og!!S--2j &ăE+NIq`XҐ{̭1O|}YωD5ٵI+n`jӮsP 9&g \qzYXx:L2CVW-0 (Di0F7{vC⌥ȜƢVa>"{?,Ƿʞ\Zś;IeObbIaAEE%&9ڔbPb+fJF r84=o\nNL}}W+(q0Ygi1>?)ӹ:{71`30V":gbnqfjv ̔3JY)0.M7XC),̉9-wdL,?-:f4X;1W|nVbʷ(zkz?ظ&ml_ԑDiRY=)*GAUB'x㻵*VOdQFrgJ'RMQL+?ub(/._1 8b~0lI&_FjDZ)_ڬaYs(M3 PY#T K8H|6=_'@*5)L܅#Zacq<>PS!sXmqrdNㆳ*kBm510]?7f`1n r7jp)>LhӦ[ pJ%ϓxtae.Z뭆YW,{3~_n3` L\2z=Sb[)OLY~}r6- =S-j& x}3q2VTVV4Ju<@>.mPpЉ+x(ژ7S߅ED8r-RDsO}zz5fXѶoen[@#aDߧn/n'JںaK޿s|s[+iW&[ɭnzv>7u.kO[ 
\I6㎇cSdcV#bc=F~TBdlt|Hʷ-|ۓGOJqB*p-J~)(#NgdCdz-A [ZTȃF5Wc/^U-8i(o@[0U I$dq8mt0Hs'A5>N|ik]쪎s&ZKWY`kS@ŇE$'LSVaڡ V>EVVҵ PYntrR*-_%KEE+R&I+@haJ(ZKW B4QsJa!<ZˊW `4VCR`.\MS՞+H2KCu+EFrDP@ +y5U%(/o-X}J([%RVȤ깕 Do_wa^V#mJ1! pف*D{+veU_ S :A~.L~y؜{OH/I؛Br, $PBKVA*A^eeKKvI-F{.y;׏,KdPZq!^ Yizt_r P@+݀$zo]@duE*!Ɋ}()k!@]rtR8A `|.(28/n[p9[1.[_AeU-O\^vMvyr D\B!DW_Bɻgip.E6ŠWhl/"cN+v{Ѩtb-9\{,/_]X8%.{qپ?N5CqҿS?)q :6ǁ|$t&Ùf | 3J?x?X? f4;(^ا5؝k Ztq|;g+nrz9c4v(Ĝz:ٯ]+h9e\ 6 ' gyNe0Ͼϯ:`+'gÞJ׷onE}n $Μiͮ91yZ_bHׇO~BM?T}c&N$#^GS EW(w8'T&A:1R$ӄ 8 0J$ٸz v^]55mh<;E:v4aF3xO zM73h#%yngR }Ow^}6YY<57,`ɵ7"KznuȜb JJ(sJ ˦v4})|r/͋G_DC9>nkQHnHs&T\_eC 4 $wG6fzN uL2gNG)!! cX+1X[NDr,(΂!(R+X. 6Spn2F9v[/-+4Be <%AHSnNfI0h8oPKZg M,g DLLr Q31fށ$HwYr% s$ J*P!j9 Jrt6vdcocܤ7kq."^؛8DaެÂ58Ie6.9Z6vդ3,E{FHY03N{DbJ@k46N)_!+z~1alb$)pn~1lbD)gS,/lj6+;GŔ˳i$To3/f_9[@<[5?Ԭ_rO/Dƃ^0qpנ<$0Yp>G+c3E/f_Ƌ9gs"/T_̙>[s}~1L]|G30wa>Zyht>{?Cz:j: 1j|wagG㉱az m,s Ƃ88aO ށsDg{0 Ҍ3 <sbotP;y7|9::YcrsbU^uMLC<6g5s*L/V@?34h#~dT^ xl93?Xy۟v;tCJܲ_Cq?l se5_V]otE_c8Wq? WKnwݚۏ9UvK?iVP!!g.I2Pֵ]נvKŠ褾ܾmTڴ[Y҂ڭ 9sM))[`]))ڠR1(":8D5w8Er&Jn j2$E4In5&0jE-"C[s">-=v+CB\Dd֟-\nٻ6$ p: g{.O6E2|߯zHI#!ű"G]]U]-QԻUĠT1QwlUlS~[_ej.dJ裯A~l4:BU ʈNwlcݞ\hfݪ,iꐐ.dѦL!IۺNMX m t[]&]w=gmeYk ȰS@t"ՌҶëUMT *ۚfVM5 Pk{Z5ASִ@k|s[ "jAR5HLՔnU(o`2 19Z5X՘H96\&HšcŮ!m1ס  B19:5 prNscUD1ksmVM`5.L 19Z5ArTM1xjsmVMPFQ1L0:`13E1üEcns̵jq9f;l9旐cyNx3\6kc&X,6\&(3?mfMкy 8u19:5!1tcns̵j1NDcns̵jg 1y=kmE䘉d 1S}mVMhxϙc6k1SicnsOi.)SGsBz4_i"ӫA0Izh2 Y^})CRNEC2 ,8J[d;.Ϻ'fr2ۓO!ӓ2O8Yjr HeqSJ kfHA-LJIj#V72ݩBǏ$?O/kdDqh.(p~eWGg+Νr2 (NR?9s->!=xrq)UX]@#(wo]Qrع&0KrdJh1)_^ٌ N'NOHpI58 F-TEʙaRD{)=!M8{ U@UmrfގF;u&0ICֽgz0̦Ӌ0&nw}mf>UӅGS{f"2,;>%LJᨽV 4rf%p1P@;TU9*!G=ˍ0pm#XXO8w4EZK"bXc#R=G!TPI|RFCEz{B}~dS3pfv RCH$!T߀"c %d +)(%J&:P/RѩX B]i,iY1\eV`Rn-q*(*B7<'B! 
!<-QiM XrG i>&Cn1%g h"MрFkpVRa#IIIO 6BǓ hYwLb{U&/`/GOQTbQƳ9-9VIrV#F%_,ǁ=LykyH$@Dw8D=86LF {k"›L00dJHZu ܴ^v ẰѪYF"%0W^JX%,b|`||>00t7糉qB(E~eTryu'f w}pGn޿1g2'u7ﻅ #^x_Wu ""C4+O/؟LgKn>):pA4xLgk>5|a!Uv;]\0>~&eTH b[GX~ j%T0_90QXN;fߺC*e0eJtfnyxE7$4sbfG Q2ݞD{ c\Xs5S=Ό4căV FO&׵O^0Y$`3Cf$/TLkzD >Uz)| (Tߝ;tKIx3 ;~|ckc?MKÜnЇ[3g2_tzzf0%TpIMf+c`%4C=4E5`=Oh=Єɧ0IUd`5&vË́b8}^aHLgAOD0F)nG_ng3Neu~ ,y55J_P+4EF 8\ԃP+0Lh_׸> ؃8]O{!0)%ŋ@6oH9^>{/o!v\ix` X+K!<+LF,q;,qɽEX܌ q`&aq[fe Nz&f"z3 *f]8u+׉2Ԥais8/&xƳ'x}Ж櫚Yxd؇K3'x E0{b; vB=(').duE)p,tj|ko{r^3Z; %&Г3W"T6.U2pYCIGE6 T: ȸ$m,PR)!jTrT;Oo_@H-k!b FC4Ni7]:-J O#FqA33RiP̅ULFxp (o)ŹI0&fH4;})&u{} N;b+]envV{},J#Q[x/ckVh/E#k,`,dF-|RJ}+$ oQR=O 埕y> WJRNs%NW DRIRD5aA'^Y6bL&Țrr`1_6Viӈ:*hP0.QZA p}<>! v4D V0^N*UMS¿!:/CA#oИb \jY,^~܂%KN-1-t/-ww?߽y7_7 }gq} LR4/܇E%A%~i4sB)^}YjD5]2i\yG!E0{'9)`v%7GqCBRjگ3JCteRM^1FOW[FU*zqyƅati7c>yPrs˜b+fqVx -*\H];}x|ML_'1 p?|.q% =It=?c0g~eI#P<񧜭|G`zd \8$D UfI4_*s}ޭ<3f5v=DDZ'.9]^ m̼|1Dg&(8 sdFxqme:ۜѓpFe޷ V3sT˧0dCZ:( x Dq)Ol,U혏rIRȼDDt:VQ'8S CADT,/Y'9סh[xڎrw:r{>\49dX=Ig;|Vva/#ۆX!mco׬Gn+ńDGݏF`&cZ),3,%rݞݍpf2~r(:)tBVE$܈h=gI{+,'&+@JgnLf>0xݱ-G;Yߢ$OŧwXV Z"u&C$Cm7j5w*w<ܼQ(,%H(?-`"lւsAI&≳Ix痃Akt>F;Mx)4CMT|K7lT4_+_:#L"bB^^/iWDZe#J =J}i4"#e,QZi> c$֗C%fTv{׸ sXk \znDIY(chTqyC*aQz4BLط=d|6ԗA~A)]@Р:<J!2/JZnې>r%ZR\/ v#2"b@5~@M #ŭ]l`$f ] FUQ t f؁'ޘvۚ:"!GիKwib/l\}'OqR1@rV9pW BP.8rCYVO-ʨa/h W_h2ÇzHt8VWut<MUV&U5: {<"a(nkFoĵ`Jcht]ks*Dh#ͷ(!m࿍ɵV7P#05JH`'sHf ǞXFixOGtWoLRGVòhq Oh-'=ʣ,齀6 ȡV?v#AAy4W|Ԡ4IBh·jH%T+Ræwct6kYvE}=h $#Dfh^*<ڱ:V޵[G:So]"F+]48|BKB$ruptꥌ{5g7{KNsL i?p4Ve-OgUʫ/p(#7ɱ&ГxVɑ5JPC4)PD^s|pweTBԔS2p4-W \s?R+@aқ%jһU.h: ԇYRR ȆOa1}O˘/Q8}u|Ƚ7drӥTD l=eqn=Fu"-ߋu ҫ ߇s^x%1D. TnX7' XfN~:0.uSOqQo)9ׇ@] ^('[B+\%h~'Tme<5&{e\^ɦE+:ط\ ="m]˦ 89h~,et/ 4HȋG`B$C?Zd|>P: z3(Nw)}N e؀IF('c9d>B+aTN1U9$B4P^/4t`TV:0O}]gw5oB0TteXLmfިLz֢r1+Ҏ=:`*q5{ 6A4~aś赆ٶ['=9 +nM|n6y⇨*3VҖ߸$ZdGE;[M4#B1MNZSQN`9{ `8 /.l=\^\2dz +FHe8'+ ʅD:&j-BAd ZJ14<>]ٺGcee,̕E^YT|&D`,GG((tHZF])w:\`>E(z|s6HYcl@O~YFTPTL=EҌLv9> Y'|ތ&ЗJa_諟tIޜ]YHuɏmk5a$! 
gdbdRNj#~ўp@S\B Ʃ(O0AE ]2^vzh= #}E-V[J>YD,r^.a :8 vVIϨ6906927_ܜ~YoN3tqFksP$9CB1E k1`TԂre1CݨCṕzˡDIʎflK9D !e#"0څz"% ms 'QLá,B*J#\2Ba 0h t2^k5mOYP=^0ģ~L+D*M;}h%J ,WCJ#"ZD J#б .nsҰ#Da1 Gġ2BE%H2,EF,(h'GJ';}ULLK*d[ &ֳ?˹KpGϗSn?]A-uoUIs1Ȼ`?}k_ΗnmDn;|ѧbInnЧwJg|;;zzU]??3`ΘVR>_#)(:_X?-k2DB#D_EU mcB{ս@ hF Ae&cr##Rc}Ja)f c3i0ۅq~ ߌ4Az:C5clni r TbFL+?~|02Y4N?<(5y |(V׌p"s&:]P^|py+c >fvy3%kIk?t$mP.u޿+3 䟴gwߑ6(0z{u7ʑH9J%A@r0NV`p7Czij<6<]=s{x\2['̲!Ab$ѕHhkJ)5qِ?\Ogz\Wo/.XҞ@"PUxIHv)P e[s O(*A=04HfQڢP|0joA%l-mgGKccǒ5j e_ʵbFhã_q>$"V~|bG v.1^,]oFg?ݭñ ODw`?:<7(:hYcP H zv?>WzrQb ں2_̗}kTPNfw5GP{3ʙꏛ=:7Vz}<1Y}r@fGG(0Tx}̅?Yt=aP(ҷ?T_4Xdiҋˤ?o:qf\;fbl3@Z9 2\Cc h~dz . AS3\CVQ\өȗRxK!D %lS26e vDJAS(S6jѻ!I侮!FN!i-1-&+w&I~IkٲF(0G{C.Hs%ZsDQ #P"{PkP@5gFBH2@i M=w4~O 5PSB9o<*/rpbMT=Hx6 /5$J nZg8 І $:c>e} 0Och0[ͳ>kVh}V3 B kS@)k1#(HTc؄JD[ҌD6|ϨO\AD#0Cus2u@Wx#!hH \Z=]nT<*GDѾsO7CH-75Qa(Ur6cxY1]\?Z C\NfhV?]5Fj|U2twW6] V8L B$z|AM"YZ2$M? Q dhfMQW_G<  0eujJgn3oS5м)@5,r&?kyrդ 6O`p)[c-kL Մ;2B>h~j]k_ٱ&۳er.Y%9309z/ijwg3u8~p{| kkzzr96/q:(y=EW??3@3;=Fzi[pOzo!j\qW΢x*ڍ+.AA ݣv{3T[냖j.8+g6<)ykQ#AGC +p+$=!#lKaaY@JIɶ*R38!ĖxDh!1nT"mS`X% y6T&y qp^EFS)]8th c`e3/JZJ$ǝI`r3%(E\TNޠhBDpoqxLLJׂxBBJP\3-ey|%+?lyֶ N85׵IJo&6A$DS?s3%jpc#&r:)5*B=9rS(/KZN[w|q/ۃSq;]6f#QNxNCb4ynPH07G4D)DQ tfb>YƷF(ixzBgw ZcZMNwާ)n$Os**צCbٻ8n%Wz v=x)~}ټl 4ɦV9' }əĹiþqؒFݬb]hFpJ'h}po)Ntb!>|*|n^ͩ>V2)c@3Gtm'2ILrB7M۬ZrWX/"L@LS7LF Ф _R@pGk+7p8 kUwޥ<ȹP,}G;j3Yw>PwhLz6:u+A蔾uϛ .\օ|"Z$SIzM<|RN;X1O|bBs[LaLT`ad]K ZcHPCM"NZ-3h\"77ʰhxlL>#"Βˋῖ(b4s9q뀙6=Z8oȲT1 *^$"p06 %ȮG0Ne0>"N-ʮ ,q4JAV0)oן Sx!pcA'zloo깢}XmٜPH"wQ. 8ӿM.of}Pye޸&rupo/3?5px$_St|_aW |Y09сzo& n}߽LMIkB C.MV?1Tn;{G;ߟ Vu+s|_dhd393T@c[ Aӓ(l? 
·.y/#[ zcGR$Nm̵Qϱ,pWT;iOf})2ƳHfHDq5^6Єm3p3TÂ+lURplm gy4N1B2x/ظX-pa"2ֻ( pZJ<i8zcgľ #k,7(.gj}->^㡙Sd"Yl%ltt{}xs93I:zn닥cf%*o֜)|gŇy5Zp_a|桨L0p~c8n%íz 㓳z?hw%WE8(UDlytRF_Hfl$lB":4SHIp]KPch$l$9;t1wN I֒nx /AO!B{g `dnlbjpuҒ-Ni=<NЈ;?~߲<) )a'[[Os۫+ı5i0A*5H!AJ-qv.yNDDwlt7Ϥ7%Y@xK0 k8h,To}}V-I*HxLX16^ebҰʎ.Gr~}|Nmpt}>GAҴM78"L䄐c<H]] %z2RM\J4 "}rN&sׂD*WaO_V>@c?—OyE"E͋T4A $KiygECFP+t@s* h8u>+Z/:,kZe\5bz+~tdvNj^gDK2rZFQPPK浕sD(gTE 0A4 w:Ky+oX@y_v*jfVs"*Up#T7ؘYK2$o2NDT5Dx ) 飋D X>,4*7ؘY7I1&M^6jń6j-X ňef3:%Mu$"j6HL]ύ.#5 YȤA 8t!J\Vyɥ ܁+{^|U[^71., mE@oWQx$j!(EѝZ&S:,(X&(S%cE59yEl^o7S%—13q56PNֳ%e=un͍LO`D:b/mƊo̯$Lzՙނ*zt]NV,}躄*=w4s j:lMZi'rJևt-PFqovt HR 8)o7425܄%){Ӄ[hV DS4<_8ak8{?xuz3Q!tå{c$C;w]ce-w^oCo) ]_!KzcjR,ćZ#)\2{NKz_F^?lZ.Uʃh3`<1M+=PЅ|"Z"S69ҝzú1"bJ)rDmۈޗ`ioVLiݺo\Dd =M&sL߭Yc;Zn#"dÍ+:o\Dd*3|f:z~m&RvD * B9m(![G4Q IZr_bt*߈dj$Ij$ϭ"!IiɁI؄$= }ZRܩxp"F][J$ϋ[m$& /u (EPiW4إ3ʥmCk{BĻj;qgǫYncQ/|ք+O=5X[ԙ81?!E:uj ` g@)ϞmX$1{{yiWDaew߀AzvՋG]dEQ=du0 ZIxK뽭gC[w9HV۟<"u8>٥q{yͧ6{}Ǯʌ4F99a}Պ&6bGR;*}G\#jP,9ݑ|Յ|"zLiM>3+E8wxەfq`t,i~Yia{3ld xn68]S.u4kKA}]L6NCt҃4-YN{1׼b'4 %l5,n}o@st(]hD' m(:+ֆ29ͦ:l|Jvw6Hs^Mk"*h^#CeGP},DwkyY-p·wg$?$ "I~?+`IX|zp/6*&'ǡvqwxuxڕLxHBH=hnQϖKd vOLsoY61IfΧ]7hiIz W\_%(8!~Ό!Np>ֱ sZÑ7^jji< h+>|V-2Q|%F\\xue?f/fz۾/VV<8mu_ݿFu'6Em Ph[ )B+c ([1ҷ@۱ ;EO; | 3`~E* 2Ib9#a${]vt8. 
_C0|[aJٻIvVzjgEkw5kec1oo_NY,`-~<'ҳļ~^K%s_26F:1=쟟׍bi nxG<rhgj*͵6.)2ډZϥ ƣnairKFsUzwsM aSzP^G@7.d SB|y,'(roPv˾Dik 5&s4ẁuIBK*xpNnIRյނvVU3FLځmɞ"1Zpb+"!ߢ"-6Q$h <,YдhxbojbG~ OW0 x/7o1-՜V9Tr8aj;vn MG漽ٗ=}{gAx *v,3o >@OիXI X6~}0iwۃEԫHT7O+QIw/ i6!cLz_h@ʓ$RM,GDyG=mdd2yH*/+x6hsA|c_8Ec_8E^>_VQ!4WT;=GNPb4!D #Jb0jW'))a%s6}X?YnAK$BNO1c2da]N1V \*H2np+!>؀dK#6F%d]ˮ?%znMK}H2=VؿrGZ*}s2/~~ 1^ 9|瘋t| Nr,]!` =gڇ6xi`9on $vbi:r1M;5;-MJŒ^jFN7.8vQ Ӵ\j^粣Hnp;F~CA*gzǠߍz0 _I.n~7`iP"?_?Vqa gT wQA|ߡP{=bHG6'A0*,3b@.A`UagTy;:?'Hk|~^}@UReC|%p ?&:rv}WJ΀v_1Ŷ}^;/*}l 9[[Mw2qoZ 1RfG=Pбmm̻@#Z%(sh5Cx"G{~d-%$Y n9y̲H>4}K(€p'|c-kdo |^)[0qc^t.3ͽ_ҿ내0{22v1g"grt3y,ߧNtԾf[!ٴ}.K)5q%WМLl؅+j>Φӓܴ"㪴ϞizqvSDd^|Cgh֚P} 3Ѫӏ4dYGW':mni?NϲZ*E#\Vu"H`+.eWһ+,#,!Znj*a׺~=C(%l?1d1Y\4sctyۦO2ԫ?iw~՘=yt5ai%~}Px^c{cڠȋZ߉z%N"RؐlvxK[Zṥ: H&ijʊDZǒMI~=3 V fB'sQi2dpj6]8PHNgS(5[ p0)׍d;T1:u6pS"j,Gel\Bl7iJUXJ7ϦL:&9 d&xb^ɔAH:CY}[Y0IR>cZ ^(b植W^hѡ@Pq.Hs)peF! 3O)? 3 P&3[~Y ;rʯ=-@E`1& /&p2Gd5%rvK "VɴD( /CČAF ,'X? ǭZ[ Lkb:bfbF \@PRhL'dRF$+NP0#1˜;@uĤ Y\<0{"3 j0%/RG,~eT&͢3Rfʋ%}CHMfGK#ȮEOZ _Z,Ys2Dǂp#44>4HL;[gFCAI=L, ҆ k-gB^r:rܯc }m1E]lNZiTjf5DBЦnlɲ7xu]F/VFlL1EnGvӘkKH:)Rav0p{!_oS`4yȭ4 d\DJ5D[Ɔ_C|h+ɵcrPl9My'hVt5nw˘~_^&D2Ǣa,+&k=LT!T|n{ȣbNG<7ֶl 2 *.~ac.}HG4 U$?ϟmMGj W|8л";wJI!FdbL*+#vu/Q$׮h7 y> eJ%HCF 9R2f -U3ݪڋ  L=9G=G;P0QěrlԁNmB`վۻ-sRsRsRsҖ%>>QHV+K9p)FrK'wH{\%wnjoS艼}põ]O[^z$˭CMJkVwMx`~hb'ZI-$y⑬3dyoUh__gLk8C%c!r"[! \eڗ^1*6NHZHjXQuT7MԅTZBKs/{*/SϮo_^+(]XCע"5",m`9;"F9LFxJDDib+0N" 0e챁0 'n ק%cK0D}97ܳ2$ ryx@],=ʹg2EzjQcԥs8)H[{L1%ңN:lTLJ0f/ iX"mܝ⤸*N8i$lSp{˱t~y;:|rmdO~8}vy l?vɧsXd[^i |A^NM*U?8R88X yn|߽_iUoK1g!X\ѐKt%2LA,=ak^|'$mk/ɾBH>ݲM(;5 n1PS}i:ڂn0ں\Z,x,?]놧S- 9ٜ#"k P5|-KP(^$j'J,r)RZT'X2>QVh NVrmkI"3gzّ_1/KvOIvqI^`4H{Cb%n6)Ϝ=hD~UbU.`ʑ$b{'( 1oFrkOi;6!B9w ub[8t".s }s.~oMF"{dv0ޜY"u0}\3aT{QܛE|ٕ ?_߇? 
w_T~#=9$&Yi0EL Ü?IÖRɋ$_{2攪TīMxe=ph4pFr_=/^؛EGytZ|ۥ?*De>l/b XCmer0 I@${Hc/Oېt\6$ؐfC92<$]t IH "w~9%gw#톲Mwhoa]܅O˽U<%_PXxB#bXu.Fp73e"ra4IuTP\y` C?6E80VCs}SwY+S3;xsٵ E}qm5tazO,&}B7be/?Im/avߖ!>}oԛBL^ .pm|H [ًW"PiKtY9TJk%%VG~vo&9sY5|Gtd'z7MNNI%a ڮ)V5nvMj 6Ra6|=BW>m2׈Q!ji6JSڛʐQȐcI(2$U#"$U5 Xy/h~"?rR{dլ/M^7r;+g K$%!6xMT*ZAh+{I{+ߓ~ݜ%9;DjSupž]F;m¦% Yj9* ޅ8F` mz6_yR VJrԅLݒWblco=lŨ [W@z`tdkÞ7Z?Dq#㛌WpL |>. 2qy}KP\ǫ )cC7coԡex2s9OPhB~8WU]g]D\ g}]¹tqJ8pFUu?,ίJGTfvq2iV^[] HR5&H(͡#n Z~. ?9 HVb y7w7+fxau{_@($W̿MIP'+g'K"~-H~{.C{U9?Bb`sZK[h_뵸AHAA M>1/YuwzPx Lj@@#F0ohl#p#s0}CUMI=6ubf.Rq.z9Vx7Y; "~D*USěYZbIZ{R+ZY0B ep-RQ"bM]()5ul܃ÐR2Zed3FƤ(Մ`!+9 &0,VJ!ITk?P L)2R[4R¬c6]N 5av|׸W68~ 1d} I0#>  [SM:#֫E!4FeX5׻߇'A\جr%!oI9D|:7FC=)!c?Y026F^0TM4ȫU-`P,qB)UEt 整Vc^Ye*l*Ui:5 &Yu7>Ul%\vp4ܣ nn0华GAjBQA((.yT<όO:LvZ(+ˁw{ 1$1E#kqHܜx\$kI. -Prj"bTNEML2<{HW \l=O2UkpyEy4QLe.1))$-rqQDr@)9|Vwm^e(w~C܊lYE DګIo.RJ[z2'c`13 $nM[ܺC돌`2[:pq)$ 8{1i]HkRBj,8#c~6 8% ^JwDBj rgv5&P:YiNco?kM9,ؼbQLԎX@(q2VkCM68;hqYKAX\1.W .8JϘLg,5"n"<A%0l( XcfS)G.Cfɡ/0U3;uyIi$AG]pޫbx1~v@9`ȓSX7?<둿o+'pju`uIeʶz`K&q#'idz?6)p?X/uԋU9}#l.S2%sE`#4>^ݘqp(Üc_PMr,ydMĵ0 ϰI!o}vKu~H1V.0幅. RI7OA$Cm!TxARVM: .t\lDb s⪁H37|`MB%89 ZII=DXe=^ <(g>B3gf^3u<{E%Ep?YPxJoqoƎ=cv߲ CJAyJ|Ƥ fx6J+ $ȍ"kF 78jY-]E_g`@Eپ'cbE$Yu f e⦲;1yV n}I(`xʶ#ڇe|JCrs F'2)C03Gߢo!y9r$F^B]qfX2w7绖wEI@}2O)H,؛(J5*cJ4 F ;2w-"fis1Jms vD7Y:MX@`XM8]cBs$@Z9\3,VЁ## WY큗3N Ih)3aLa,8K#cb|M+WjLN}Ar&D[6agn"bǯ~2$\z<z. Z.`IUkbzpZ܄0Dݬifzq4 ?{p*a\ycws<__0gl&}8,n<~8p{ {!:~2y9O ^ oifR:)FxfU=ܴZ9ʬ 3M1+% ]8$)K=[[ݴ}f/b -6XؠMuYj^+@t@ ic?aƎ}9[haXLH1gs7~!ᩂ;?[YUKQCfVmwX_O)S˹O]5RLל HPT1W9q\]XzƖ2_W1>d 1c> ẋ}2SOC4v}R]ӆ_4m("G19e=^p%OXQ]VQd*Ɛ@U9. 8Ίvj6Xn*Kwm[0y?bM&^۽EʏJrwg(ɦdǦHQvbE"Q$ssfxκڛ @k+O0ְh}'3t{eVl)2{?y +ͮ[+ydmכ! ›YeuRt)\?v4Dd0Tvq fQ]k՘/n|%떙M/Q2kVHX9s~ZSj|DJhE<ЈI;Jk2[Azugz悰pcD@Vft}c_-7/w&G2DJGSj]gqYYF ԥF9ёc\iIˌ -MLm0JĪB- G Èu-Яqd]/u8qh0f)6Iʭ-n{'N{? 
?~9Z e6\92k /BLǸtk$w?D齤n'65ze#xpGSטeWӅ?*,¦y~y=4i#7@Hn|m{gQɰ˽1F0j79- &3OfiӯI擌BPHI䊍ëi/5A_R mb-A.5VR76%|J(5P!=`%CKW}@@0Xܐ}`K))67RixRj0KKx.-ј2,d]aՓߴ3xEO^(+/5gAVRSNo{Oʵ*] y 8ռ2 CR?wdH;Cu1/Fj\"fSCkgGVF 4'(\dj-2Ѯyѵ!}BFrѸN(sZfgq\Ɉ$MdrG]1p9ƣ͢dDa* Tv\B{3~co KKQ80lxKխz0 W$t܃aw~&Pe&>qb'sp~ q'ROҠin?}.fl2ʜX8޳s Q)ܛȨa$#YRP Z3"F G.+ME_݉ܛuMbN݁: tNGCaOak^]ƕ"fմjkjLخU#T`јB6<T@ R^xP56R3#a*q^*Re`΂G\ R#yEJ2ޭ**#7 4]jXңOD$1kb"Z,1 )`Y^j0K zP`h.cPq9ʰ(T y$!r%JQ-Bp  N)q^́y5.JHH^{5 ?i$]M˖&}_3& 6ɣ^`>cLPIen\Of/r3#9Oj-FpArNR* n2`lY3P;CLI FF S|Q4:<9 .y󶷽5k{f{z[ś7ܟ^`n7޳ۯ&*2>텿\G:~{/Ysq`fk^5d/ny~&4&tuz3Sv}c ~Tn,!XÓsl;Ct O9%mfmw=_+^%^cpn+~POWe/{6=フO@($EH:ggⷋ-6hO{xa#OtiU<-;;yɄ3=K~;G01[ɧvF &ZÛ4? { Ө$0Ywrt [[Cuc_7 S~sfOriAnׄiS2ip3NeBQ\<~;t|"]DC8D?2X<02IP"-#-g=%huOYch몄CG=BBcNy%L$H*J⑌>ZQK IJ/ERDk Lɫm#'r (5Ult~)| Rm {6wS5Mxq˾ JO7hjЋi$CVX $i8x5,YIDT^\6DMĞZF  meUK8ijX3F4ic ې1h!y!y!B%7,řY3gf8^Vf0…Kx>Ƞ GDIWjhDRiYZzmD yzZ#Rq^*+G.暸5Buzd4Z& èCI/LPM\S݁j^C:$HǬz/@TW,>dY) >ڨl[fОKd0'Ah `Ay 4.9PLd+;F"CA$GYAD Y_Cp˔-Q҈ɬA:`n~`jcM)ǔ"ɛ)6EҦHI"Y]JxzcLIZFJNZr%|*?.>ejI)¢0 Ahdj V;ŭtTAxM}T)$@ U۵B(B(*h@LXP% #ο;|]) eQ=|WꋑT}/7&R^jKm{Ŷ&ƙN0o.t Y!6(cP$cTke1J {q IW9o _(Yf;rVh`_ٺ8]rory)Cv겔`+)_+%~ֈ 6\J4N4f+h`6$PFN>vқi YnK67/6/fl5jll`RY4א;TI(`&'Bֺ^q*^b_-obLNC1< O'u޽{q_ JL MeS-LSRB 婻3hm?#Yj?#YHmQF,^A Y/bJhQ[Ľ倖럆` u4aXqF[3B85oVx|:yyC({qӉ,56:3ZzӉյx\U*5 aup$yOO[/rkRoy>_V5cACʀow [#fF`Acqtԡ\' x}>:>vYg | Us`|wD*3vj=_*,|kL>[L-S <hj=pShj *qZր4&gVeSws\ƴzt=¸N}Wt}\ C =)LI`{A6"*xuD \uF-(K2)/͉Hက|mӆQȀKٛ׷;~gFʬku=[(*:m>*cNߏz|6|k 4t׾Ӏ[xTQFӆN3} ߪexe5X, B<:n2YèiSn̈́|7Ur)-_:%yEJz+Z!uxZ< C`.<-S>e h@e*-׵L9Qz~Nj5^-SrvՀV^8-S>U vǀgUVU.d hoѸdV]@573Y}rrw^H^F@NV P|9C>i냫k|\of3;Ȥl6F9^/~}ӵJ/cW_=Cٵ9v Ԕ[TF2 BYK)wt2dF-gh95>5~M _SSSSS5DZ݇xuC!2-+GZ=䵲l}{6$%=Z[c5 'BJz"h%AbT I>V+&hH7@9?/`@]aogMvb!j-_r@NBhdZ5Z̽ N怞 H^hR"rdX<>dU=\$/ ImEI&ẍ́4:)h}ѯmk 4!cc3!!G\(s#U!4-A騻J(SRk0.p/M*dSQPG%ԫzHf:Kl䢙cFhRU )Mu4UeږbժV."90Gَ JXa!Z/9 @$\GZEJXʑu#~F̭P(x'0.y /$NLT6'J[JT/Oi l^r̭@rEi hD0@}K+/ɹU= I$U P %Pz#!BF%uMpqah4iuPԫzHN~k3xDBS}d)c̋xӐ>BT)88R3DÅ!z" fS;ovhdhzAG4^NAJtI 
RLJ@}꩗ޣdʍER(˜DcHf)9ݒ+[<Өj8E.|%PRD\$fH[d>qr2()(b"P8Z%XzVR %UCG1_%(>HJuP (Tr(d0j<*bWĦAQoqP/60 2gCedZZ{N"d6'(v(JŎtʺN^H81Y 2ʘ2$HGFA )5@&l4r@JZ'K>@XZ¤•8%xA,Krm|._Z>i)M)_SL9H5*7-@_xDBC Y$؋LRoL7T$r.{`ʁ;eU(U̚IQWVw-ڑ)7|'Z\Yqcqu(. s YJ , k,-YR 6|֢NA)5T:#f}RxC?K YJ /ѕևnɤj,}kJgQfF1R|rD)ii`9uK Awp?r@SʥjwZf( N%%(1T`3^5xBOh9t65PūL>C& } u(yj09RY OMcH*O'Ch#M!9<.e?(Byֶ#/uKPyqL75B|q+y9Ͽ^5"[~ 1{>umW6Wzvi쯤_/V1ͬe8{*?xݤ };?r>XZ+p;NMߌU_y~Q/~JeOfO 0jD3 <ޒ<}ϊ#tr%˜Ma1v56C?]rIGפKmFb;yw}wvWWY=n]%Eos#8\̯LAb]2I(*[z07$4$d@|ӂru=Z{˳ZǖO^\^hޤu.8flfwVC_؞t a+s xx1yywBJUs'D[}h +9 o)/o9~hWְұ&۳DOu\oW՚Ïݾ/g]y醾L{_ӏ+c+c+c+۶z5m=VHhbs7$ϧ+~.8W ->)&.¯45\g]5~u揌kb#eb 0=^JqbF֥],&?~ 6.`PH{)X/VxB YF& |`*JS֝:i׳Ȥ:5!N$h <3;Vp1 7J2M$X2*)n*wghj!]HIƫ$;QhJuSq,i\H0Us"BY\!1OǶ"yV@N$.VCmclp[nh"ddJҜJ-r,e:LA٦/X Oz^!zlˤm;dh"Aozˎٹd/{) Z7z"MQzX5`9$؇K1nu$yIpUVoKʈV >X 䦙 zX[g~˞g$ߤ3Rqz]/O°uTky8e8e%1Ƕ9Ybd!B Dml[RC) 7sR "hD6A|!ӯ30E{!s(x+k{Alͼ]I)k7(͍cC?T`s+M.&:ѐ(#@o$\Ft5-6fJkJÔ?PW*z@+#ʦ!~iuskYQdse`6LK&P֒ߝtڍ m)N݈LQmffKe ZqHq (t :ɴb <\&nH%X*F3 %TƠ/1[3Ub©V3|7vKJ]#` `"akJ!':ɨPQ陝A@棌 (Q) '޶qoR#e Z1`ቨA%_P:A_B~i?)z-yTV:5dr<H)?mg^,#hK2Ed5"7- j e;JG/H*AXp.rja]M_ ^ͼm*,so)Qck\DBǐmT]j!yiq`MBq,ޓ6n$WyoK|2fͱ6@6elˑdWM6u7CԈ$3+iuc-dQCǒc7*ʎ]S\]G-LU4t.9"=0 %iclN~(<}H/f h99fT0 *l7cCFUV=aRdG`rRP VBJS "֭^T,90JE"%canVLBB -Fy'*QK<{m=y!$_0O]޸C_O!M&>?>y8I㛊+4 =~Y'! 
()N':z­ CKv3Wo~;41&OSVG84{O16.Jeݭ]4#DvZF^~!2bKfy2e5˨Y-k>4 D`L_# Dk $:D$q ܥO6 f1# ]my]} 'oŷkuޭY/?צ=y˜e?®[S 3&n;ygJ5O8ԙ U,j Om>7#ߌ3ͨ߬XI8J|%HLԤ 猒~ r-d\h[_T͚9^qSzkϚmBk%o,+F v%#WP42<o^ALOGmkʍ] ,:IR-K EBXH*kHl(" 2j%Mc(Eo`),a_]cn&0cIB2IT-rf*+x)S*bX jїB>lm뱼nK8^8TD4 8w \ݼb 9Ĕ][6:b a>k&BL ͌4BAɍZ2L4MPGsiT.# Ii4[L\3S$I4Y1!ZئF{^Il-wBefʠigRM*Rjj8 _Upȣ =;&%v/ӣJWٞ-P0>PLD&qy 5 ]u$f$zVQُ!ڡlcwCvW |i:R`Q%F᱇EhE˫FݶC?^ZCJ `qh>H7yr> a[ o5$Q$>cwC)!Y<ۖU,Ql0ɆپxvOde9pjxSW~ev=zRVQj .Ӕ6_?G-١<:yt ~1C&9ˣuy(Exzk4"K1Ϣ E=Hl!xso g!Ҵ=y"M("1yo ȴtaVZg<Yܮ%]•nT.(ܮnQ@z]_Aݽ=RTzK3"k/-5t0ƊAD!Pa $x/p ѬS8 dQ7jh2 ",7o n{+\v%0'R4P9xl | {8ɉZZhZ\QӘ 1.Ǿk&L*䐠&V>=Q|^L,&>\-*7׌V|Sv^{"Pp_٤8>B7\xKGR d?d}0[Q~ 'ǧQ(yQB(J[Œ5ž**`]B1!ݻE}z]oEFoba-7"F>+aWyWub!WFL'yezKoQLm:TpNSFsW[ DM2j +K>VidS_s*Z|>~VEPǩ2sLZɢ>̅RgoؐER8|W/4KR -dNGYIAbx6BtN!o:Hڤb 31Z'}d @;q+ڂ.>vF7FJ!fJ]%HF!M]?ͣȱrBס*_..H#n:^o@j0Rק:UxDlrK'>Y׉6E`Ŝ%Ĩ]V W#^5*1}qP_iE8+j%|b9b䜈"[C/OʓIi;Ή$bRuEp X:DŧȱD mN4wv! h5ωځ#i9xy3 4-l76d7&7-ISb5,l&a@Izid #Bb>3ƚ4H dS}cZ+z\hV H( .GWwׯ~H؏W jn܍m`)lJu/lƿE_pl$57XVqjqGb7\DYNfT0Ǻ̄{]H*"kRXs{Q j]%/3Sr ?`Pu0 U[͌k`\Fφ୧ dzzb.t;ҋ \Kk.i{p rBUKm%~DRp͵P 4?֧.ҜOU *Zd"[%g.KC) BCLG%8vY]OzG/΄vE0gB Ρ>؝T<6WTQQ%0Du&|:]CY oSt "ˁۙTbw.<Dt3+*΄Sfd;ΕAIw>V̆ ':љLPaagM*0srJN=+ܤ>aI8e#Ggj@g6i}]4<3*km/ҪָV9heg=T.`r7N܌xks3K|S2,h9X>9.ic*=Ij}H!1O\ph= !oX[||^K!lhPđ$m:)Ɨ-~ О1*CvB;z O`ƘRH2/vnwT9$kC6N/ *[ry4Ş&[Ͷ8+@cө^:ca&0Ls!;G4JK`X[fB=Y;4%5q=ǵ S!ѦHӅOҙOI.ucTLZC5{guV\:PCq[2Ou(R H;-O)UKQ\J%uM.w4Vq/Eu[Tl0M~5I)̤l׍jn|AOƎq턮B!)7)pPLEKbol 1@K[7|rX=8ѻZ ᾶ6Ւ7 9 iԶ')j}.NVq*' A[OYQ~jX3Yh'xW@1ݖ}RpA)B!Znۓ^VDo8DP"E=gusH}Ņ}z|hd{4Yݮ@6vܕ.#Si\c"Ap(P>e0]z\(峺R=r*%.]&Ygb@~'ЏwMBQ}p-%HfTԲɓ_,0GSi.OeF%! 
.O=AS?7Ȅ,PbpsFq/C/Ia١Hkg"d =Ek16qAQx͊QPMS  ,*Į'w(Q{u4 Ww2Vs)77^\Xh]Y HݐrZ%3̦Wğ}PDPBsQaj͊VXȫX`+7r6a0džyW;L_ړځ[ZΉ6uV9}H]j+-,J7 hzQ|.$Bnr0ŎP7=AunW T\`C |6Bas7@imغB~H RC)ek=X&pgظ(NU£!F+RvA]|x#A4Hķ}⑰pGV'06\%8X /jUNT &[(Ām/D-bƒ) Dc-p |k;KHX; Ոtg^owU[1Xqy¦3EG<:\h:s Ff5EyԚ|q!ي pc;ZUv}ҢNG^O0+H@[C@3&jǺRJ;[Z%rH?M6Ik}n8(H2kчOےdHU(fH|gbydSLʢMtM5Ipv!{ùBo r)^:@Gqљ٥%Xי6mG!!@7>؉ZJi.ѻ>N &J1s ^ SWr2ȟ@,;Ny-C=0'M)]sP &4erE6hU f{ o2lbɆ6!j{B# Onq|P! My:c~ CNGaGi╣45UYgsNP<03%܁?~ Q"Y"]chz <[M ӟѡ\QghӨ=ćEO Lõ?]'L4b|MY$-,ϗuqNa8Ņ#rՓdѴs 8}2sjPR 9plLH01V z,sHa~ec"M~_|3  (}g>._ߛR9ZBV?L&\? ɋ?IwbxɘT35mM ~]k Vwo_~/x4FISN+0~Թm5I"c68Y4Wc;-ØN>9>AG|9'A7W+ Ǎn\o/>?'a8^0ـj %Q0 {@\qÿj{B J r U(9kpt0CsN\Y旡Mmj"I?rk1>IsA滽OA OޛjRmdr7߾o7/Co 77%`h|#w8޸dAb .oS2躟D$}"2 ҧ~xqKKyh:s/*!+ :{+h(= ]j膪 K ܜegһp֛ahk]npP D`F|`ΒgvfC念6RMlX@>%Vn\ϊW7/O^WzGw 5`R0A7׷3KM}E #PvtJyr _heԫ Bw_A•2(K > =>^н%'ooK7_f/4Z!v3oaƞ_xIL*6/-.؎y7:8hp|nkITL߼\!Lo 'v?]DH2B!>iGXI I:11X(|Aշ{U(E0@x~cd*F ?w @p&DBCPKנqPfp5Wn &SaS-ʭնN2L+]YeeZtϊp )@S;ez2 Փf@Q>\+x؂y3o9 E-u[l:j@C2ؚ썰ӨHC|%(&g/hJk+4N45}@; )^{}xBA8ra(Dܹ,OU_䝞_;z])aQDwW֕+CeGٹ#Gk^G׸# eĸ#J-!z,vWO{.sރǟ&N#YŘ\ߙdm_c%==O혳kU¿TTUrZcTV[W?uN(;Y8xMD.tIo09.$&#;|d(_t "7F,ǢPxCnfěwV$j~YVQUaRՓ :9pJ- )ơF;^.syJgCJWTS!vM;'PfbN`;9QbƒV.S:唃jڛP~ȂJ lle{̶8hE@.Dj*'JVzX" kwZkGNQm=gS|jW^-5\iiSǝ>TZ8SrnC|s[YAOx(ݷTWh^Or#0(nev*ncS׻%BNN<ՕNނ:ET*]/q٢,lo.E yRcpS-ֶHv2ݍ?WSD-{,R4bMA\Xݼ)vP/i%Ki>H$f)?m9J sd\JL,E^)ŊQ TZj@-RbKD}MX|QnhX#^KEzrz@\HZlhCFM§7z92Y B o/e ~?\1-|h:LV^{{7N>I/{a9=+$usLEl9 %eS›t)ŷT~}nTԿS/x[OCnJN3G?hVqپ8 r";͍Šc\ LL߿w57нCsk|y\B!7 T1WP z])ia-ь1#ac(l,P)g?DְY?7.@!LJ( N=sa%q:h.$2CZ86c? 
QΎ5 2_ӳfV|g?tC}g)d:SVC(8 _@J$JM#Kh4UX8 J<|Iq%u]wCzL ݑpxY\xQIΪR/:< .*eGTQj%@fH@/ϋ~?i[\"N|29-d["$(lnZ)>^61~P+ 3LŞ0nZ~5)Bqj--p0f}, nVg(RR~ʠͰHEU"v2G , Y>Stiw/˻ҳ7~ KF?̔Qf1$/aba$ hAyt5孏cPcX99rúenZ`-n{ 9 +P;!2h\Յ$r2m?&0e8, !?V(ug~ᄌ `5yDףk^y=W>T!3*!C M0,810A1dk J$ $|܏޻B Y^G"m]zad%r~(] }\MwvTKzBFoMB򻋬c1ܻQ F)V GN;b&[{SX߭}zWUϻojdHRɌeDkĥNHXFe49xUljM"(my$hjihȄ*A5KLPB*1 ItXZx'nykD/BHHw&6XXTY NFP;D4P DG/=loFׇmfY=4*4ta6:Η\9ᣕ߬ՉգuUl_C&8wh߿G&w"; տ>ypbQ?oرbXϷÍ-?#?Lbztw//zg`mȺG[NfM8ܔ l8(XB/98#>ٛ HrBF-:$xep8+r8]UuuUuuUu5_a:Ka2jۯ~ӮhD;˭$b`oh9hI#ƏS¥k$۫7^ ߃cU)8Mjޯ-;0o&Wn`Z̜, ;" O͂J;<󼸒2c*XW}_mG_ l*]%F0xIU8 E(-*&Pa-EpnʒdVSʛ7 M1Ȁְ 4EكW$w^В{CY`ˣ>/ffWM)o$UWWs~P*>A}_ J訥5ИWg*z$M\p&()^]_n3.XtEk,8EX^MYQ &,0<*5^W7hwMhQb&$$hE=M>KOJiϠY3XHmj8yFqN^|BJ?-&Q6RA]0C5OA?&udz(Ҵ#}=+X!x8p(6okǦ3[8bY̒,& lƻ-Ր]--xh۲Gj}~C֘*qBL$!Q-2cq+D143S9N쇻n+&dZ#qXYpK-܍ %C\cVȀq%T77Aխr;w%}R0cs 붥`] dh/ u ݰAVy3vϻFvP=ϵe>)jٵX\vEb6= $k90^.e9LkYY[Ύ-{3u,j2}Xf܍&~eͷqg3~lpz;cͪfpu Gp--0`v t3lb$^/^ JUz (pE?aIa"P+KRbyIqJ6 6QF^:UKDu6pPVk%FF J%Ŋ1/4)pSٻY[uBfG@Vѵ)W<[r DN9΄Z.<(Ji-CI`%W'b*08;B\0ȱMw1 8߅cxE8LJD# ZZEغ;,)M 4"VjOJĐY}k9&TJj*е4}q,*%P&Lv3ZlбXsը?Ftr;ZEQitlz?&ӸAq>npZ=W1f ?6cuc,)_'aL|8xZ?SvYF2зd>_2r}N~5LaXl8Rzm0rY% y"!Sݸ֥/eU. 1`}|~~a6@"R}m c+_MY91U[⍉ݝ'cN/5@fv=z [Eb--fHk9R8ƭ%R #FA݂ m-uIld%=RSn)sBO'ωT%6ѻ\ ~sL9G`ȏMۘܛdzG3ۋqJ:X}vSIXAc+Д E5q%C,8' (%Jp!:)~\Տr[vT!G@z?J.ۏr0o\ )\qw㮗rM%#|MxH?AxsX`dK(w\piN'1h.$fuuj?mQ+Wed=@.n<ɚ bw䵨uDZ/or}|df-(ܣŊh+9<%lcX :Rşr3*'u#Q/eLB"QM .o[PyMLYqK^#>>)"p>1ؤl)$殼*9Mfip d=ڴs-B"Z!eK}K14j\)j ~ '(᬴PR/Ck,A1#MG9Ɯs_h-0F36[o|VD7޷ٔ\R9R N/9Ut@*ˈxRBl4Q9W hQRsϬ2H8S2-NY"3x;WX7'^ol7@ ]֭#Q@yGe5&`u_f3`-|`;S0hamT1"TԌ؍a%2̇NJJhn<>i̩P qLb!9l-e%wj߻];5cvhN}@PDγn,[2uWfzp\ErJm=sN9]VlǒV^Ohx͵jFWSpÌ,"ң* ŸxЅF3F=TK9ه ƈ(N<cӦ & `J'16>/&J㲖`R)lasF#\X)2 oY9L(Ӹ[K L߲7[js@f'C9˭|o _tS6IqH+(nE(d ,mLy^L*YjfspOmFerKCATĪ)bc<%g; e՚ ՃiԜ ~\mԜvBf#@ ,aÅ[:7K׭Xoycj21 Pm"b|onVлF4wu+LnÅUQҋ1˜ 1'qM|rRréW>duGTaCu>mPMDr$ c4i r<xʉ1H~&1HE4K*\,nA4u It2hݎ<%?;dւq=\Wtma5W_no}~h60 I!Zfm<;6?z`%g{g."K x! 
qvq{}Yp01+eW9Qc!̀U:>p2u ,.O~:`o|F)-a9)38Tlfm9[o;߄ADҒ0/)LȵYˀPh-4kRs2U"`$i|UZ LĞr5k#*)R+l< B;+Xp@9R:oKc^Iboa5&әۗΜ 9RBNzͽ^p 2aU) -gX^JP, Ðm6o5,PWŦjHHOj=τźK >J|_Ou% -4f0K{Oxǽ!B Eoqz{IfC0<}z/O|ٴj&=\˾ <cz @s_>9xtpTn'D铧D[?uslSz87ϳUӍ6}}oň<{t&BIRVNwm UiaXII9] y鐶V!/qـ( e`m(!\X1FR,g>JW[ѡ$N5wc47vl.Na/: \YH8'K%E AX{'!euTecGOTpjp'c4sx1O xק y"%ST~nJ!h\ RD'w&xTcԛv˞hvkCB޸f僷ŚAb":c4nzW"{ڭ y".S&FQWi]*1eᢉ!4^)Ҙ c1[y$aωdT]Q<-|dj'L6jrח ]af]w <\*`c=VFwU'c0-{)wyEwap3_>X Bjq̍1%ޙZ\iNJab`bx1/&aZ㸑"no|ɢp> 6`nbYjNWI=/컵ӥ@,N>:=jQŚi`i1t2`PF B @>*E)Q2`YCՕ3WQk*7hXdfȮ'z(ٟ(8^DSP`VH/K-ʔ4sP YqZb%H`n*&E }0Z)B (,(yNDt^1X. `_@37(9)_Vgr_ 1z6* ^!TӅD=myɻ.itIRilTdE݅9o?,[X,V> ΍l}e9/唶~ $wFy}秩Dž4cW7aGPtTd::鯯Nɢ+8~Dv}'jIyO/^9D:%^O++yvih%9$A_ϗݏ#.Jۭ))܇qwC5GNo61Ȕ|g'y랻dӳd"K'|V5+:putUܔ-[F1H} rKT4+(\BQvS{]*\o=ikPӺDrĞV PL% i4<-0rCX!3"SfnnDe 3b-jnk'TS!ZT5ef.dSf{*gZUw_?&\_M$w.>\_\}DͮrfC٤0?-?WS;/҆y8;IY9Y , i]ݧ^ݗ Տ? 7hp%lCBfXO g;8I܍PTmvTFAMw0_4ߒC9tjч0Nm &*m@tL nVQ vrb#/(n%vFWT)d˦A4w/ Y/(jXֿH8y͈) _v /|67="ܯũwsG1eH~h=;鷡=w9;/.d609B| IZksb:}tzR~,[Z&|s(w!LF'KszS=Ӽ'S܇q?m:٥o<~M'7Wv_m,^PM 2^V`KxLQBN rR&ʔG^tõ +=]sHYt=M(ނ7Yy7b0 .Uo=á\Z4zS_m!,~|l!ǝX ^z(H 9RbO(wɮ2'[pZf7+M߆O׻;5s !:z+Rq&c(Fsx-*1Ggs 9UFǂα7Wzj?NtFwJכwNj5]If ?Ϸ.I$$J DEe Ιa RȔBXcQ ^0AҞ=3g'oR;E#m_tvj\ 6%AêClwr[|C>2H>Lo"9.%O7efaSMäiKL8(Śo&X,k$Cvlo!) 
{( (3SL _FR{[Fd[W{-h] bʅ2F# Ar=m _/YZTXVժ^QT/Qh)[@@V˜)m**Z$Pa·.IVXnxf$)gvR{ukz6ɗ8M3gd֯%Wà0MuF`F*ICeM}lGD9r4#gT>5eQjf|w̌!kӼ76<Gfmpxgmj5|7Ã͉ZdÍHonG/?=8hDz Dq}4D/_+u|7 ):fĻl|QiV5v@R <$WՄDC@| ې@Аج psHPY mǜmBS r,R ?c ӹbs>l0A՚ )~K.TOx~I]l7o}ۀQ*=y4q@z9jq2oUꁔzMj`}j>Fxœg`tk S.M\P 8h ƃw2 1:ޒ&ƽ 8,Cj}f-淞ٜjqc"vx.ꬁ{72\0MNoC"IwM&+KaQVRѵ0(¤7o66R!L^BS}&Q.ʔVweWehu,Lš/Oj!J|)H]_c?ϧgzw>}XRtl+flm S>t掲w>)2VYx7\;7*e['dSOi-羶}W\rt$!߸)%FglY7M8uk˃i:pc`֭Lօ|"F{ *%XZ kclZJdܻvپ[hc2L2_Ufŧ\0wP6#)7\&|a5(Q.7%@X(ou)H"y[粵ȹ|l!MJ^ Jb*{PI& J# AF` 4h+d;Gn){ƛef5br0@e!sFͲ!Qkc猬07S/;`!cwFM7#u ׊ݠeu`~%R٩z%{#yƲDBp']//p Vp LZssmc)/@)HH`˻QB8\i-[`|Pp1F*e~sDUY"!AB[q^GTwS)(ƴaFhq`S_ic|s8WSP;NUJ6'`0yWP2ΤdϓSÐt1NV>PL#3ƞBKm*H0؄%B,RUJI3ӞRG2-QM U=)tCm1&Oav$ٳ tjsT_g6DC` !7Qgkvq:cfy@Ji6·ilj̔COkf) 'm& RS7nD[6dgJ+Dfdij-Ң4ҮU^0\?vnWFpt1F#רww7BcFE8pUS xpnᒌN! WҮ-fkmنbz*L %!eI)_e⤦khEEݰ:uPu9PHlPWAEfMo\gw~d_^Z'/J0FԮ}*9/rlR?HWuYT4^&6()nLҧ1Nj]ӈY][?IMP00)ǩK;تKz [ȶ5csUOa8ݥ9nROf*.Xq&p>j+ZiQfб3ynPÛ1A;nk =v%vk.( ·! Wv#,t[U)9T%uİ7 m z$oGauoӘT3Aa7lxmXJSNJ-;~ TB/PoltTE:6ԛiwywM_Oȑ㜺,di3`dB;WFiEJ"җY<В)WhTbdAoti{l-Փw&v0'oQ#П2/)㝼\_wp.pifvU- *L>W9ҒRno_Iic4Fc'G ᔴ F3`a;!I+6c3V0UՂo˅lZ/ݢ6ysvN'|̸"u|PA ;)橕3l!Q@lc2*5 xr% \wEζ&bDXCC | 4.3bޫG6j/s qat5ZAĴc* q=$Q2 E%"B4BAhX7r4PK*6[+Qu< 8O5*RV)vRKoebqT @Vnu \kѿE-7?=wϦh2%N+4kdT?L|uh4@;N-dѼgϊJϒ>,%Q’,cNj^Li =98-\}NmPz a83xlm4 :}YV`o77#OPmPN oq5z\m5q`߼Z~B[T"1Q +8H(slDHK^^Jq5ϲmz:HPԞNdCLWl #?:'a4Kv'&Ybh<4ίoMa5'BHt]!ln&AQZyv#2@¤VشV$ Guc#W.3D+ϕ#10"$ cl Jb[,wRves%%t+#{s}A!a43[&#Md5y H巷z\5;=;PeqU4m Ym8*PuiEV8C[.hkBμdu* Qc\3\r]4P %U s`U3 T7Yj+4]ޖQEΘAǧ7J5}C\BGP1'y_~+ˮg3*8g!;x+ZQ!U|2ƙfRlџQJYTW'{=n]99ejL_z狖 ?] sd4G_lwՇI Ad y) F/U6ޜ`KrV=Śǭa6ZYx%X QVD> 1ؚ'/}gwĒ3g`*/@ǚ)Zer{-p *f! 
˴WăuP%-GצCAE4QK5N'Z[}ry|Mgj t֭JY(/1`B{oya742:ۏjd;/{IXntr}PaC_I{c(1jaR"ޓ/__5 4TU ֮NVܡk,l^5,q/ 1Wr%o;@N %ɥ9.e\{9SKB-۱m9Z0*flkVG ˶PMf_Op  ݨ&tc#Juv?AxÙ2N< C Faj.h!btLe^b#!Ѝ}JvziTzIMdԔRiq1j¬DfcQHhvWz][AKv">ћĿ ,PPa7PI& "Pq,hI0I5&\q|L4hyr#(rnX5 =ta8+"=GSj[fQ-rRq.%hTsN:8V_+9EVqNbٸAr[q+j)U`8OJp/8Ų7HO~Gp^'zc 9 }7?ygDr漷$2mvp;,M^6|oEd*O*4Qt%N:O ˈ͏MZte\7տͱzP%a%J "h$JǵA$mJ8΃B6wյ]L-h"~?z=dC&KEDϨ"%29VFJByq !E 7ӏhv%r:3Cԃ>^OL l:#n ݶ|~N۩}Ȓ.S-OIJ}+u C yN6QZz)M!PQLnr,$} xi;w7ˡ܀Zʓ~ w)M.87Lc`W~UM~z;]K{ 5XaTY2=maߐ ~зq W|ttPIÞ3SIwPsP;jYR\!e*!ltR>-<V#[ӶVLij;x33EA;=0KXugvT J3JTz::-9Vz:'7NzQ=L݉W0tIp:ғn!i_C^b3~d\(()Ss^-x-\dHND '=ȒH^-N; YXıՠM &oHh^% 7J:f, ƀ&hҒ1b,P8qs'5T4T4cyR)d4IwRi$BpPS8gL40tx,"Q\EPnȘr" ."(kX\JPEcT[܁ QHڪ Ё %KȮr׫<}kB޹z|b0r(-\T˙"t ,4abwuן|tWP;t6_mQcJ"Jѧ7Ґ1;Mlǥ1Lv Ҝɍ1E!F7KbA1Ij(s]»Gn<@HAB FW]15)GsnpL_ k0crs :)Hk;Er_]# _g7,g糫R2dNF9׳b[?d]b?n.&Ȑ+wbϺIA?s*s*s*sɝ7P_ʏkc;WB}pe0 " Eb?V,lR'#)ctԸgVۿ_VOrKhQǢ; ?Ҡ;O/h=_V[ *qoVy;y/]%77Fͤ YǺ_ݵ%_ {Ԋ\C`XJ2J*thB%zW;03وYnkrߊLӊ?h^7uٺHJpM'!Wn>WT>?A˃u=BK$70VAc"ZAv%+-˜eǢŻZ>>6?%p;KƝv0iَZj_[Sƻ[;88Bg=a *M'ͶD0?na wp*qAZF.*шCaZ7dTS`w,1!E?{*W-nU-m˸Xʔ6y}P BLh((utJNk^"pF ]]9:Ѣ?Hp~$# }/"rq=X|z7P͎C1C,ًFlИ&;PHX@%QIjhEG)w =hJ}[* x:C͹c7ޙݡ \)d|fT=ѻ2h jV5;Q1Dn)jFp `5, 1;ATD@7X99U>uK)-Ux$J?3vxPALinm]Z|Ov9u)~>>ssY/6&N>F=腒;ݛe< { tRz&}Zm|wnW.0 jO[g·طnw5[^իk=~9^|:-8j'?R{/=f:gŻ}!G=%w~yf04o\?}=Җ,l4=ܺ=r c!Eᄝ[SyحRKN4d|ͧi\kl,'Emܮ9,w9tQI)'߅>h=lz s͜#Y9p4#vKJ\:Rl|\阄,P'~a3`},j#4JOɁ35c.8tl+KG0 E7d5=(#vT> Nc|m%0bO"I |Ja~{Omn7m{3 zM8̭`d?c{Fℒ{;Vm'IhҮ^׉ ~kZ^NIгΦG~ͺsS,R9>_;&iϙu./?$yn@:/C6b[HT֑!hmPֽ mk+*\A(yuz=:^+5wz)j'hIGBYVtqww~ry6-\8k<@>:Q!#laS@d4^9:h@(aT6זPiڲۏ8X, ŬjF,0r~qn2s7í/c@\x}zu#KAfkCQZ-?e hfۤlJ1Bӻ6Kݡ,H;9 Y @)!NVj9 "Yatxi;^W){;wI#KQM+HSp]).Gt uEȣ(Y Eu뮐 [BAă+yYyIwP'YOcxm7HZ]?{ Ic`3Nv!K.NwdO摝7lSHTN838#l1v? lf{'y.W./Ov Bd ^yNC:V6z])/q S߽ g3uFȊCj1 )XE" #{-of/ ^wFia Sa.;\|oxqT/PST,7@Hݬ ۈo7ka&9aj?߽3Z97 ::L=4RYN$t-c\Rrg2T VHrTZPGr٤\II!H8AeɢX+9 Cl/蘅.+ %z` =Q0ydڅ6"C=o"#|^1UL:Xkb&62'yPhjG4&fiWG`. 
8Bq΁l rjIكC, 3LQ;>uXHU:'*E 6)Y DXq/S\* CY@%B;++EEʱ#=GaYyDEi!G%Y&l,xS2Ħ)Q(9DU9Z=KL{x1&)![@NC,lJ8!\\$ a5Ƈ)kfyp!g2@V+T%odF4jTv1qIV l*DQ^IVq6amB8/G(E*l`]+Gl5IVHZ)UwZu,'1rGzxf`J!ga-I 5 8XK83RRl 3SЏ@]&hD/}hH/"Ȭ@ / >i4u?jLNaOu&m8iA@f.eN]anX(g_rJǞ:WLY16bG6&wd I|^ڐEln,wӴG M{[-ZZ[H^VdYV;ub%_uWY[MwS߁h {b+C؟NI~;/|-Ohwl`nK|~'׆_N~A#h$gT麚^+窽{1Z& 3Z(s)INn7tzlD|nKAVt5L<0v8mTHs3{^1J[}ql$B.V;`dG4I!6kɖc:j8/՞nIN:&zQ8"sRtp#|t|eYۓV` y_5zUnhYNT[QՔu9DV'bS%Qa gbT c`> i8=td?1,m*ݝN:zr*QD +QKM6({N䨃]`mq0h[p3pxk>|8̕.x-zF~ﶇc꿗M`,^?_]'_J;Iː$5M_bujhg؝FR,ve"Uнx`ˬ^7n.Y4)';-(wlSMT-ox"5vU-KƤM7;m5LKf~̮[6Nmt(g I%9m6d1|uh<> #d0cuN.q_o&qoz5q1Q>DhpnN2 u\\I ;&~u[ey%wmxl0lXl 4N^@-3\%[UoJ|)t<\S!K1/Vك"S +я*tqqvYYzHO dr5 (AQo~zT~|Cld{+5_t=Y+&H eo<Ñǘ9-aK;^'O2P"O-y&D_n"9:(}/:!R۫Q>_ҥBgU//Ni^'-竓fd$#p 6V]SXNx6"ǒcm[l޶xBvmW&ao[ NCMr}`{(Ki$7ӭ{U~^)}1)9L_,팂C!{'[;"9ٰ%>PVf "Km+uYi!mQCM=oiM) #t!dasB2G 0#3!drU6%U1~JIAb9>TSBi1YY $QyONJ!gRcA,K=WJ2:^19X7dn~˴w?&BS/?*U`F}~ŏutדz윟vYoU9Y]\_W'˷N_$nnh5x_\ͽ;>WՇr彩=>H?{WGcJ0/ ,P}Aܧ CL{%+;nٍ-I( d㨡S 1g`aO 4y G QvCǎ,YI>?#y X£h)Y(`h7ZNܶR2V[Vzcpe;?޶sV`~ϔfJR/Wa~Z±/}V+To$zӇOE?a\i- \,$.,i( fO0K[¥~ Dm3gfҺzk6to<ϟnA99qP>d?ElŰ8'sm'0!+%~ C!b̰OՌ06sk>ɎZYNEu@w(|&o*'_4Xtg y{/kj"x~) yDZahRlc jKfx̀zQ3 Μ2c[ r{p7n:{tPS1v̆JҬXjF[&osGqp٫+/X!&tNvቨb9&q!NAJ>Go '_3ž[ nWLa*aZT%Q%/d(bVJˡGanPQuVʮFLKv6²i~^ ɡuuSk93d= M7jFOߒ'ԅ Ue41 IF=#q^z΋mN΋6=13kB33MEGg4iZ(Z%35`ZT8Wh2Ghٗ{3oVɒe٤nнܭ?#3.mA-ͷo2^ 8g`hT)v3^+,FQcze =FbZ:IJ)~AmlqR`-47qJ>]j֪Kke8)4i14nLH94iǫN*}~MLSW ޅ˿~߶9.;&i:&!S3f'LnƃQ1[=:V5ĉ}d򇬅lQ!|/+e7ڠ'>v=^iz& :3bi]|H qY}>4 ŪvXM2E{It`f5IӀd{VHM}IU{. Ё|5Czh<$דּQܨאo<1WIx5ivubAS^·Žr"G=ݫVs;,X`S4[0=DGOxcs\oI^M ء&+Հdc}"(buͧo ‚"!E;VR멯C:CjГS)ZC:.CjѮSFC"F C:ꢽp]AQbR뗺7XMz2ɳ49_Ɵw(ޕ//ox/ٌ̀jSg?j<)Q9DmSfDvXP`rs2+o zG'Lz0+*;jŽ&I5&ApU:nK̲xN ֪:oU߁yntM] Q9Uy.Iθӕ7z|Ԩw3~:7Ǫ4@^aFSƫP5ύW[+%OcUќTuێN-EgG۫>*b^__6pO.=kFyOQ }luQEkxmދ&uUɖ9Lj]SKY4{_:%_K(|Q,blɷ#W>v)S:DZ!G!go-sd2AbtZ{lu? 
o݇N|~ $λrUhibr}i60;QY`a x ZjʩXvԃAa  eiʃO*GHXJ{w(KuN%ih-=il,֗mXWezQeckӠvr񈔵kӶYӬ|4)n*Oz+އeRv;qV(i2NY/%gZHk[~ߍ:P8??%P]vt~lr0L֌%+Knv܈5)g\"'nڸ!5ZGY QR5jT[A|MFлQLR0Z#!\̽aj,^ҠM]/'H!ʤ)AжDNwN>Ȓ?j٨h2<$ P3 jCmP\b赊:3&A,BIT@3>p8jJ-dd&zm?hw1827⒘!jq|85)u]Gg Ds =D Rd!y$IL8@!M&`IF(l: 9$DѼNaޱSD]SICG+Q ˆPE}I}vTTJtn?G?ia*-|:yUgX|,ݛ?/ J`9:`xĖY> 6q+?p_F ^}5g3{G;طW?^w}WӤpQ?~R4e)&%o]C|`r'应ft$=x=e9j8-A&>RtDލ2dϺ9H7h-*״Ts Si}Q}Q}Q}1V8 yi>IEd?CbɊIJʥH&TMJM}x}ǎ}8 <6 %Kn^gU5O;y"[)GQ2ڔNuU9gRg'txL Im*J*0NX>vkӔVOAaz`8}ZY?<=cn]'bpbuO߶ s>\ۭݺ4J/KC2in2KXqf\GǭKz<)#~*,{;e ֢0)`T 6qۂy.Q^s@u6]kktR3CNdRr#M9`Vea=*3o1Ӝ8 Җ'Pګs*yu]ۍay{vÀ}@B`!xՌU%܈7IX,7t{`+~SfY^{4$ q(ر AhLqaYJMPUxPEmس֞zmttzʁuUY];\MdKй{0kSw@UML{NNe":)wڭN k+=Z9kzC| ÅXm.MT}6AscMF x=*XөOm3ҩ4ɏwWTETN=h˅6JaDQ%ʯNe(}1)*w^~xv"T;>yv&nIy﬽BU]}}1G7+J$฾ JQ? <OVoLRxJZ tv2c';"<8 FJ624Mb'd j͞|]5v&v9XuZ9 л.zC9%xj]EޚDTӪ4Oef:]Å߿?(kT/E\\ި҄ҭgm~&gzJǍmrjR\)CTlGRk99T)Q}D?Zc ¸)K,W_c\KQZXnb߽E^{1wQL U`C$f2Ā=.*bN5x3DTu7MlT̃2O.hwԬUͺ@Le:fJSN9`D dM3gOYؽ&1 Dk"W$EnIrC%0=*$$GS&@Pss,}%p$xVhB[M?0l&,NW|W_x QR]KZ)1Q `ee;mvaP;|-?{Wmʖr[w+JU˥XjѦHewI9I?`wI.̀NdKÙF? 觻VɦZr#2uVH<>ʍ*{9]D)JTú乱v3X`IF:ENvϪ*Y+S0b € L WD$uYK. 0c5`X[y-vivAnOΔP0i2d$I2[ٍ}R%Jy{G^(N%fQb!DN-TK9qҼu^hdAS)JTVx7I)ZJJm^>Nnf} )*Ky*UiK}ƠCP +(ux^Q\4r"JPDwt ^3cןoj;z=ˮ_ >Qe!`\vۙцqYہQ@JQ;?京*snyBk:X9&ny{L- 4{O +"ĝ8*(ێ?4%{`jMbT_q~؎fbWXuJ뭹ADsyLjwGwN~H&vD< !dHs* nR(]E |0A6׶Ne|Kܑ+>r~5dZNrHiFy1 /ʜUCKp7fG^E-tZXSyhBk/#NZ{auR0DmϦ,YSӺⱎXɢnar|q{ Q b q#~-||)Fs 8CS^q ;'}0#N@:cXƘ1d甃YG, %M2< R&|jd d%#) ,3iQY ee+w wQpa}5>#.AsgH2Vc sm̠;5qC5L#0~̵Pvybvyb3a r򣮭!%$}htu:VRbFW3YBPF6?m^Sq»uA2V<}{RVcpsy=V%L8as|;yoN;DoM:߆IGzؗ##@Zr)z"]Ogz.rC2V`H HЌ@W+$0;CdQpbj39W''P9[3A$y߮cxa:2QGfpJt#Qq:n|) _?pOniu tZKX \IΕM|k<B ̘KîhR"ars"PzhB xL_J6=3vUF)OUXȲr]CU=C#hɽY,v# ר(7^ 910yȺF{JCe!Wɲ>I VF%UptiBD)6pШFQ*qOf855LV uڈy:=X拾Kgz/lk֕\V2P V]BqsfjUFZZW̻Sj;=M[$35(sYPYj ,xEE*&2.kQi-HՓr.h[ mQeщ)\00:Dq#bv]g( gĸG`3mhˈ'vq=a$J{GNZƈ؋w)%H 'Ԍ y]e Y%eIۚAJZ^Rae,$^YYRU+G8FSA׊=AO ta팦9t.dϭ Ez#A p;=6[rn-IHK],=KisMY?>þĞ{B%őD [C[!p" N(=a.ڜsCOҀa?Ic%}^hqNS̞֣k5FAq5H>U~{ru1/vE3Fִڠd<:j̶f;PɐةgL#0yV! 
@ar%>#Hl%N"k^T:F8U˦P5@L`*5=DJ`.V+@.O };'ǭ}d M׀/yYq#;ң\B-Kƭ(PR00I 90ehmGW9;98˂@D#ӇF$ v*XC9"3FHeW1&;78o_&(VΧd(qU輓7̫q٬Zg[^dpm^s2X~$J*+qL4窘*j^12odF*X W @e ܞ5*:yhV7~>kZV uS"D'F+n>nc^3<Ùk6N!҃0p/ !3.p>3驞Xˤ+QHƈ%J O1kWۑq$Un(b%Ǧ{~{Oo~q}-FGD g z*Uvf)UqKU){F`\Nț juwETը8ɒ VFSuŕwQ9vtm¼>1쨰ؕzZTk؍iR.'RP\,_꧊W(>0)2Wa8ɞQDvbQSF V"L]`JdΙ WTw=^8,ǰ-fvZq{ө航/їv{Laئ\ҙn7G![{ = CcY؀E?ttЗ_) Aps{E^~Kd{=L_}&J>/~^E'oVs89i`Exadaޡ/Nt˻ҊyHsCz^}.]lO|b-X';gӺ^+RCX,.ǾZQ&yyM8k"x 㒝*VCq} R43A5,V15aHX$lCLHIIcM1ào0{c:x˾z]ѱTǹ;& $=SzXx?$`XMM^o7MF@ cHy@8s8kԬs0pc J;q[J8$ap݀;S\7=Pf=[9|.OƔDcǀppJɺW5BE+/30ΚW1O ={r9cʅϣ3uzeo#(mƜOZOfӦ[//e;6YMUE7sWeU[|dj/I_:t&}4Ե!}˥+\{S\uy}W|Q7h7OOʻѝ{V^ TR+ /Ň?#V#bVXZOWʙ}baLNkO؅eXn0.l~ OO>Gf0\g6}kO͘zb|Cfrw77Un\_е~*qvv_}/G~Ť$f!ʔfpgz0}x(%t3bޑqB s"QG$պܖnVgVUQ-?vVmQ 2L:"ZGGՓVQӣ*`&GSw`f|GՍhohfGՓl;EU9'EVN|@eN239";!)-[k*lߨZ:,8h^nv<ʀ1XOVE,\%CjQ| r]bru~oeO;%PZTlN bv;ޜéwΙU=)vwL]j\_.wFchoiWZp!|FI}^ L -*nDW`5?&X\ryV uCz7Vn$B%3wmHZq>,;f!`7ٱ&HbeՒ,d[ɇVYXbT0ͨ(l! 0L>NJme. ռ9@˻(= ̗hK7@isXȷ**@=sG<9t>LcvD5v%I B )ll&_`,:3`a٧O?Y 'M4ϑ,Fel>zd^*քvuz*'^ͧcp+_hBW}33D7eRc;{/F5{`Rh!E$#P%ٰ[x\:BJ=)0ô {jb$9&"YݛY0{y `2US%xx@_H/JR5/]v4lcCQ!wLS.83C0B>5bpvWjy12tx./EkxqMTf?è6  \i"n14NͫFvB^4SYN9IĖKb-I GKsҨT$NN db5zeL1mJrNb-r JK5䣳)YL  e?Š ߅PZBH} -ZBsaHi:!hB@h}BNIڄ%Qju ոRLF2{P !e ~){KHV^ߛѓim=.7dLXL3˨dnL7`=YJ65 n r9D7Jju}xyxLf'dy- [1[]4~[#6ͳ,ʦf͔Oκzb@qQ?Dz*I-LliĈEĎ)J +bv1 Bh XQQue?}By ׏s]CUReT>)XQNh@bi a2%é?ku`}5HW'$+DJWzS UB1e\'3e NkNL  ,}-MEXP(9$v$q*C4fMH~y =i%h],]AJiDrT!fRթH@!܋!s2vq΍L dpjԢk"QI-@!*OO.KeːdrD`"Zjq-llyL,ODph:MSAZДP(Uڄ A: su_ Eu@]es`LzC_m~AX#L 5Q.PX ާ08*%Tz&99" ["O+♻JH(4!P[K8 *pz{$~o@T|Q~B Zj&#! 
gӟ%,j7]~QW7lw/* _ϲx0 x%$8פq'%פ\V9ie_X*knEKBVаW$8eNVwq*1V&^oj^.yF=V'&'r ۶d>8 ,&yKJ,<alRuΚb> 5:RJs7չbaqlc Y+ br:]WWV68E|~{5%?>nx[vu~Ncwo7̯yXsK@clzh۩[dh<} my+M[>ls 利/ YSklҳr(*R֒qm$S!Gj7UȬ8vkʃi:FvB 07<6P!!o\DTP\-FPݚ DtQEҋ<]kw[E$S&Jb0v9ǸNYWc;PqaȶSxwTaSnфBÝ _J K)b0gqb1(I!,!5K!vC5zo  sHkaͱM|*opr;.,R)p6@6/RDԳRkF&SSuM&3T1 Cuǝ/H۠ԧ bRoz"$2Q쓋5gȀ=01n[ $kDⲯ6Ta@㒎`/†i}vV/9o6*,| u&{{7a#{M ޠA&Yp1icR@oy~~Ϗ^>?zcQ>WoHi(HF%"I n%R2,KU(6)FF`ă|PA Yv!Ц VWib^rsk#x|@ip;r"є'G/IM(aH<X9(Az+<8:D~D"(BIGFQ%JZe5#fG%bkRX;4cRRM(61)FJSpr%7:be+I>n6~GR57 xXܧ8mqPi*L*aN4M J-N0K(\DK-:MRђ8ԨmQcv֗#Lﱘ}+5-@t@z-TX[=)|*jXoO,g\1_L .?j=Z7'r[5;5 3 #]OE>"ꆃ{31CCK!&Nv|%&AY+{G8ksog9g3{<|c5uwZ}4ntM.[bqRcJuQjoݛWn?d=!9Q9 xxw@9WڅSHQ(6!S ]3ePUpF~ŧ4!pzƛ8s](!axLCI`6:0ysG]rbҺaA̡͵On}E@!R@ :j3%`dEaTa&-4!Z 5 QIy%^rWjVX{Sw0/iɬX,ϮbR+ZODE;k$!*ME>x|R~s}^o>џyޒ\' ?kF4q{L{A,,]iǟN"Ccf⊙x9 nm+<=ջ'j)Y?! gMtujbѧX9&Mx˛*TjmO҅k̵fb]v,35ꈳ(وϓ<姐uA>\s҈lEXw(7Mxcz%ºX Wk *5!Y+7F{4d˓#_ɂ2eFL!HfNrʑHOIA^H |7t `NX_H!yx .tCn˫ʰoZQ7&іk=AJk6j7=oIʕ])SC+PG^A*/wQVl 2*ݖ}yo^(LuA/5+vg=\5 x:lHĐ26Uv$\j^s&_ Cor3L8y)RhlMZ@+D"+uAwmf]>!k鱠}eNnX.꼀M]Y=X4cĪDFؕU \AjbT@K+{,[_O~2ڪEz5:+5L`^W^9"9b|/39s[Tb4gnpevy(7^"vR9㺟[a& ¥JoL2|f&v>hP+Ж]?¥JRfEl`sJ5M %TxR(c!cA1STue4:X"fՉ0T2XuY+ߙj>K(v򇬶u~Nc 7ӯ޵6r"eqvOu/?%ك AŠjJ3EZݺ*͌2cI~,bevvi@>YM{Gr 2P,[:ڜ5ӿמB=hc-' }(Ӟ:Ґ\E[v8uk+i:b l0֚Tօ|*F8gqZ7u[[NgԱn"Lgۺf4պu!_6)4#z3Aгso3y=b/Kr:{G'좸[e]+2*?>*,lbۻX.6in:hm `ƺ \*pb*01F\"sC1چro6LmjDlؘ,wXq͡u9uiwzՉ1)]sFd'tϡ ʴs.[;-yiT7oZ|gӻ._]8闖R g!U,If ~ -j^nzr_}:׫g_`J:f>)w8<$2c* ^URk r"98b@dN CAwNy26;I;lA .קn1-o,v!/{/5ku=-:6ު- Ƹ!vwoן_)A5Q|0}^1YP/7+ܻg f~':\iZۣaX2F\&v,I}?CBi.5pu5z\!+2~\ѱT ^-﹖ .4g;'i$cNni{jTc80C(K^}gp+qFΧ! 
W\ҳdoc4ǧiq{?ruXLo)CTm.NFb:ڼ(|x57v$}ǼLܲǐm!WS_Żc:T~Z{4hlp4`dzT<@74{?dLA 2WC@ I6Z1y{5yAcWOK.>uP %d8jLJXVr?*׼cXiù-qoո{H*nr#U3[*(BD 0s3N﷕+*60@dK5\:0G;rl%Qd֒D,nFWsrӚR+3ݭpܴ@.K%ކ$ɷ '5DHt'~X gEWY̧Z 9VS8\b.N}kПg[Nnm49~FjQ2xæ`MN<I0oUE5cd֠W3mTO7z7mZem$v:+cؗ{l+i1M,>viڱv)wb3d^4.9b3NX}gykgbh'xI 4mN{,)fp'Iq"]!Z٧.rs+#.gǜ졿# 0r&SksƂx|ȾB;lǞ16`@#X&R~wHJ V%R:\Z RMԍ!0ܱXs)Xly'czJM*|g9'[) 쬬/ ~gp34ۯ.K4v+GC3lѬa3|׆f-MN Ki"|YmkV,/lYq-{s#riOF/_9նLa9Y6(F,B"*=bOR hmN΂>p5莛b5ގcC_9?^ʹ K+td)>ŮK˰tv: 2\6eF^iw*1-9W,_d4?ڦC_ r˕1(r&skA4zvN$x}'D_ֺg/>hTv3U?P|ŕ@I(=6|D}7Ft3GN$.bo8oҬ*T} frUz*2Q k-|A9Z R9! ԗ'+m=/90(!8( Qr9 '!W,m -D JWn5Q`l S'%ˆbcJqf{6c\"i@ (eI~q8"ڈ`+l0 ݾ_9$1EaJ+ʷ,LG)BuiN)8T:Yoth@(\y(㦠p),e4[9Ւ0-T3[O)iN5DPȩV8V3 #1L 5hӚ\ ؚ2^7(!Aۿ/Q3{mY1'YP4*+^@MjqR'rќ |uXh 1-zVmP!鹭ԫ6e^]IE{))p/HFUfNG""D& f =ݨwYoHm.Ƨ˫W=L[ũOdtN.:zj G8Ԙq:\QUd3OkZ1WI֟vYQ20ZoK'wBSáZ ։1ChדW R741t x%/Ucz/bn4 bϋ8iVqz\{4JznO`'O3?z|ZLb-p?[,7ה1?;o'_n;-h2"{tbfiCrg!ݧ6BFxmK~,ً|`W6n3oȎA󪚠 g9Uk efvl؟iHM,XmwJW7n(Sz>o[c~M젃O$Ǣ fM{NALlfpb=}Y@_˜jku\Ͽ5Pw,=zuw<,Jyf̾!U){г>Ԝ|uwLe%H$6F'(yk_="dƒxC~R! y Uc. X/hO=! 
3ay: xr)ag)RकEiἰDZŹRڛׅup`e $JɁoI'$il 2\\nha-Gq&X&s,02xxqagH[mE(BkeHz`-& %0[&%W1\TH`C| <Rdy-#!0$ЌkkrFE5/9 e\:sZg7/NM h,JR 4 AR1T3FѳDqtf,ې$Pve1qfrL"Ќ+b X'sĄprT)nbm ˥clFѣςJo.m~X3T`?WJ:>%/6?V֠ă1gL2NL[I@U}tL !Ud{!ٻ葉)L77,~WNk%7ߗ7ͮyEN;2 76\8f- %; (ND3 O 3ݠ |(ITWT=rَZVV}l`AI*^ܧݷvŌ0x}^ fb/_<ɧ4fg nPYF@'~KϢX0DGfǜOVϑJsZy,'?CUx=|(we3|?R-bwA "_#=;h)D۱}ww EJWg):HVwH0szq;cL*\zvǂj'{KzMJ0yR=-_:c!tkr8Y׃T+JT=s 6W&5"d"89Ղ@I K˩"A}RM_^ˣQY>,//j5|S?4-'m0\z-?]cSҽDpo-<S!!߸Tz`AGtBǨNPo-#NJ)eg!TS*-SR1T杝-ތg]F_ہǩڥ(nC+곺OH >Tì5ty { lNn5 H:$ͧJrީW&:NNڙ4 ruӶ^IHy2r\Zn\="C/ +gy_@,u!oL\?8]K?wM IhyC$ 48̠ؤс1uVfƊp{+Yw(C ˞:O 5HjŢn*467z"~oۓPAŜWI0W~?N;$ ]W/wawFqSk\2cu}vp_W.ߠyD2Y qN1M Da+zYIK䉦vHǒ4Q"~sy[KGL.y;#tQ0vɝl|WVޛI/ܻ=sZxoW1Ι匒X\n̗}˴Mi?:9Cwf2sf&+/=g :W"ԅ޼|L포 ;Hx sr }d 2Z (`Yix11=ށ=B1t  -'svxOAQB9wDmDpm')=';ޱ+ `^0{6{}@T_=ZVPefNgQk/ ,={O[e|b 5HBZ:6W+"ڛ9C4QbqڨxqZvόu;ɻLg Q*(}Ⱙ&h=+Z/QXq^P uޜHmS6ӷw+6njUcNCϖdjv|ɷ^sP$[Ն;M= cD-Y+yD=RCz f0JQbQKŖ0zL܃[C[UUA#7Mk z$y ?4t{ިr~y7*AgT[,twoHַ }+;Ʒqa pOFy~ b6U3K~/m)NNQ[1s[I^ɷ* ct73-C꩏໴CjQi<oJᚱ]NUu Πڅt.|[8v V[NkТPh g_mihp;훻nۄֻG^=A+E@[n4T? &)PMe膳^cJɪ; &^mCB)H9VUyI8$\?&ţ|dvO7ő~R#1S"Ր`f"A:5Q11FĊgq4R, "}P?ʳ)q̀ ύHC,y %Rj$!)50QFk.t*`!rmVT ,qk"v6\diVӣ0_|Y e}WMs_ja~|p|>ܾ ѨWd뫋Q6Y,W%4Mٻ}qHQ*o>+aH)Wh/W?k1'٥".FBĈS<`}Wkw勒 ]xJ+8gWTK C Ģ$3B3n̢b\`,*;3XvXq/tf'&pv=Gsv 5HhbSkI2.*:xcɒH"}YI`(ͬJ>{D+q>ǝpglِ@ ?{ضw[i`Dz!玜"BVOpNb-LY_V6#\XlhjoM ty"`B`܉+n.U^HtJ7pM D+Յ%mMX,7iY%'!Lk!m0K#UqȰ F?lBS6unu|EK^n`2&j\°i6$aF ӶocTyЙ/vӰ8,ثfgOk aXOep'r0=wvh:Z<" ݯr@YLJ_ fbqCd>%풔=}4 ! >οձ'c 3YnBt%1-1l G& 5ouVk3-#w:1 ֖1m?!}.XN(6|Rsh`NhR^~A<'#'H]ϖz4 C~||x%1=F6.&\;w1K?WKg;Λ(S4SA +0@z7%CEe%? 
؎Ni_3 s/SO31 )(RZ آl0,h-BږD1a"%bt>y\͠]K@cx,/eK =-"MaT{)YCdy0zٓsI9MBq,:J{VFzF(__H3=[mesp w,[?ĖY\q!-q,OO 4Ci85s%F'URj/6]BXoW]lG1v:PM!7Te?Vj3i/͖|BzVIzSp-KAlaھHh'7۾|t߲ {תs mo7b:T0ԷoK9f|`8VO%(4-*.ZzZJD1璜BKHJ0.ZzZJURUQXJJjB.ŔZKqbdTHx Ղn 䵴2R;(0c\"fi䷮t0l_F.m2@N7|i eb͞bTjdu;OzvW&ۢ_~cl]O|Fh"o17^袬Ca#@{- 2wKǒ5:|@ u6+F"Y/壏Y \Hyvl~)(,TLyN㈃}\90K׶QI_l Q`Qf+, TNdq0e`R2S BY"qdžkX祥 aP0y﯏& N": Z-),Vqb DFii[e% cJkXH4\LP+gxmCOHdMU,Xώ(IOљk='Aڗi(F#t*t:;Դ х3qxnfxn q<璎oiswfyw0^k=vvaLBvv1%p{3=$L~v:e?2DtR${\ri?#WX367 2RHLRNlZ5NiЊ{VZ4EFJS@{צswn#cJ^޻[_v&!$V툢'mScT ¿jiFҷ7$b"ĆV}ڞi !5Ct;0GuN[Q!:`ͦECq:g#ۺcͷ-"V ~6 2^?M{J^LURzj&^;3u+9} \S wR1 %B |ZM7JDRp# S\{AݕdxnHͿ (%SkדfRv__2Ν|ؔDbJ]$vE4|M> MbL+FF4ddbLBii=0XAXbc @g~eT=qHBO^iJTV0^^iU}M—}NLe87ƺ3 $)Axݬ]&$biףu3B޵G'!@p^:/`tWˆj3©ĦKT+ /N'w:a(n}m 5.7RF_E1Zs0DK@zpRYQzN/}KcJ&''L!m'"jU RMPDr+g4SP=6KD0(?V5@ߡJ\:[Z[Hm518 '<B#&(hcuiߒ{nbT8">hB1qiJfިR `}DBi\)u;R&7+i1?osWG1gp/֭ʧӍrgT~O(H G-5SQu)OaQ=U}~:awK|\/O>WY\~1r3fDt}5Ջ;<{}^7 8z\CxVخ L(LIUb <^O?'}Iwu 43=oBRR)&:# H LR.|f:Ŗ`OT!R 6va ]ydz~ ͚;mw<D@(2]8:#0=p]9*9̓C-6(0LCj5ZoN$ :dꂓq0X#˜!o[F":|p"x"F;< M0 XcX5bc<%$sbJ\Rą}ט7˛*mj D RAWf=^[>e7m-m\,VӦuƩ7FFbEw_36`iR81^3W-f N)Oزwm:'r9Ue۶e?o\&Ե)6?ڏItQ+<*N?x]=gެ^w{H Rr^P K9+?Loܬ_}1Iy[EA eǀYE!,nd_VWaY̴u7G]x&ܰ\UTUセ>r|\eXO>-w77?N _ ή6[TCX^ fS~ZD~_SK&G3T52},ib*:o}N;iQ32 g<+Kars` Z TF}H%G7Y/B;BR:2P'"jY^8lEG$\dYQEsLYKuL41W "纐J"$$VCE1^w1*#H+<Ǹ:"!QCAN}gCLz1pHKu^iIlD0@& ,i&$$#SoJ8ZK+P/|x6jۡsI:\&x0&B%$5y;*62G)g8`h1gCX6؁,s2?ɽ{86qEB/+qt=ԓ~TZ^T;f1 gQ٦,lL8i{퍓7ei[J@!XHqcHjLf+< u_Tk@}Zc-##|lpRa9B"DQU>lqb!72h_TINzA?ym&[o{+dO2I!J+ uB~- G&"B"026 Q@ "U!"b,6# m I h" RL .c{U)zp8š`< %!:As64 b%qQӊa bE4HD5V Ea좩CUQ(VS@ڀq;k28V \k\GDAJkO  Q cCJDPkWj|}uBſk;C.\.y z'/͟.ni"þKs2I^Z ntܧ͕:@(omPZcJGץR1H "ݎh} VsMqNѣjbkvD/Rm}mF|)U3ۍCA iaK((<L5~`lI{J}]P9Su{vX\d; R* ,a%X,Y "Nݻ%B=YŔ҆.XgKl-GlSU՝ǠZvMjW:ҥl {sv`2LcyԠY l5'Gř#ptA{?ۻ:DKqőtom_`ա[qEvެT,ۯ5Vi: Q0 ,d+JևbvFԥ0A 2٩""kqHhp,{#JnfWC&aiEɼC\ȟc?F7`}c+ox%*~bΖwedVJHsHf L[8fLPmclH(Z'תsOr794y&]$ZEB!2QmcB(b1r0@D0ʊ`<{)ytoy{/ybe04֝GG@sgygwEy2` 'T/y|w4IwZn,P:tgq> ?qgx,NG4L$oϿ[ nS8 
HnSpzILJ8]j¿$ePڝ}zk5%K zŽ׽t4Q~e? Kg׳ py2(}^aYZW׍~jϳ\ +<7 [Shf;gwL{_}3wC@xăR7_`|@ ]& ҽзj/$ӵv?Fi.-_; u< 9 5u/ˤ5H羟=pgp+X1x8c;S縇S$:peN.Vc*̨^tcM DO7?hA'.{zg4~фq.rUo\Yť׽ؚ;Ͽɠ7~l5ތ=s-}Ժ8z,^P _ ,w*"+MFy^ocGJwl:|~?3/>&ݟ2]Iڻ~u3Nf W2g Mƞ?zh5-]6sr=os[U~b?G9/_/(Yn,B[|W0xs}z)ߗ޷[w3 C ޹}PJ[П,#1i-~fP>=܈+HGy P2ks{x.#X(7u[{x3E㷂 |oT]y^ 0]ʶlWY(r wRpqy+jgo x-lYZZr,2jp3w9&vF]j]N(PCu;([㼜ovaC|ts~Ƭ\ቍ>KQx;^YHFa Գ 50;b,:`*icdmŢ0hs8ĖPn*N Oӫ;C`ps%vTŚ02x'?%63`N❦NEn:y<ę0p>s& sI'T%ar8"]+hȭ.1E8&Y)̔ 1G"2,:A(1 Q,R8&Yw= I)r$ bP塖DˀVyh$EEp &,Y5Ve )z\EBQK\4@ ,0E!PjbbI#DQt4uXb%0eO.6>'yEAhFRp"pъR뢗Mq,haCD(j9Y!Myhn\">yf91ӭh; ?`yz%pT+rjvg#* ͚xmƗgF]}_DII768I9X-׭/T"̕ɯ= d.PGb{\]֨t8\6O DZkw b5OnlTr*Y &{t$EY?@INI)dYpFDp NUU("ΊYgMx?h B{1OlП|;t$Hw@szgQL-۠Q i e f6Xjrn n3D lJ**ؠ@6j_mVNLgIt)W bUPSqEBj''$k_tvr`'; ^I;!wS+&_&(Q]ʻK Д-G%y^]Bt a(8G<+}gEJTGZjSTdN5#>1䬃T ]JՎtU5EJrbVXg1ᢖvIW[Ճyb_C[s+u Íj[j>&cm}E7*._5GS=OkD }-`x V\_c _]ZO[XYz/\DSdJz9$|nN7bl/੨I⾵[zjkvCBpM)*n jav0}#v;-B>ZDlK.eVʫ^5^qtvv-@dRi!Ӿ<'vc - Ѱuާ^ z>w 7sŲmf>S@ߍo㷚h?5ˀ{K6\ZrV $Afn4}X?,ޜ};{_(yye^g7m>Q 3?\3W<̽/w<-3Zw /F-;>7zel@XM% Yn;1FL)DV|7/nΜ :/( faS_bggc L٘hj|+%lR+qM7z#zѣ̍Ill!M@_HOխ]oжo}xןn񁖰R~hSRBj3p.haKAҡkNbJu@?Y 嬅+k>h!#@ B]j -k6gc;F4}IS.ܠ}`lv\z]&ͦE:B2kҿm jwsx$Y,3sYqNQ^8dx7VϡYK \kg} JxN'uw1=w;oCp"Q8aN0UBq_*a Ʉ!2R3F)vt ,Ꭸ-Xa9|)IH @/Ƃ#ÊF? đLֈWc^(!3Xc+ya (E ;҄"V'W kZ_*ȪJΑٸYOa^r0H|}'Sj}7 _)-S(UQRAD47 ? G^6f1yϏ325fzyj&JS- Ljs+L>f[ KpA .u^e \áFqʑuSzN/  QJD ֩'_&ɼa$.mqU`_s!z^2Dٖ `ʹ#22 1s-6i1fy0"HQ\!;=ּ'KXXgd& 4C, F j,sZ&LqK6B21r֊֊Y+t[ E9G1? 
|G; <ƨSGntgPʒ4nXY>X/ӵ=- Lfu~Ag\9*\#@`cl(9H%X) VH(Ɩ݋?\ڵ]M- n67߯on%?sIEejRd2ejR&äSr=p)ՁR'iEO W}w7͜l}>#ӟ+G$ K$Tc$PXUl+LA [ye TW-{< rO|"5BQi3vBWKBk9cdF+^Ϸ F+aj8vuĢ/-R؞jOZ?!C88XZK~ysW gx:;NvQ% s<,S89PҜhJ^ SjQ  )>=HP6 Z1_o 0TEy޶;.A+ۆ3ZA%:,Ut_jg)*!sڎbju,Z)=Ԙ6$|åu'𠜣H9`|g~tVel̈́]Y υ|Ltw;u #a7ۆ,DOBJ x5P.j^摝6I'4sV7x$~醠?wE&It\:7>`%W^>|^yg=V"ϩQ{S"`Ne +sbhx8fsф)lrUg~() 'qY<\2!WhOQ6zjchan9f.8BWFx.C͇|?#&w*c đD4HOx_C\lzMI5S۲2NJWhTVVp̈և"#CL)XD8ʨ<e!h¼n_& E-Tk <oT `Ty\dikBDdj+l&#U:(VRAӟ0)Jp?e}b)X&\h+YG oK5Qs@1t?1rdm0-my%VKzR4-ud{zR<:)J9<D{Ύ`B.wSR(دX xܕ٨-Tv“lv]y *?>,ܬ{g*uxj{8@t"c2zwnL *gߌ}mneI\uWGYj8Z|mG:RJc8BSFk8qz&E(ԕGz@WZ]O.gt]¿MyYRǑxǩcGvWeLQ^ejFg[ћڙmM7R~Ǐ?`H"rK^UJr Uq~:\T,܉b{6A9;-\O{ CpvCp`p eZqދSy5ݻ0˫\jW Dt.SPcH-J@:8)q pQRmeQ[bɔaT&ՁaBQ17WrVf :'>NdcNr|}|; _ g hP|:Wu 䘪A>`_c7haTXW [? KWû_O?z\>n#9bVOv2zbFpT%Gk'/ &H^A8/t8dp1  n )>:(ֿ+! MktJ:^ ׮!hꗋaO1eD9YߙjU4<8^B1pwz᧚%]ѺS-$s:Sa)r;rjVl!lE<&UL2ٽ*üӛ?T~6Ӈi1gPoGU/Rm/S7/&|q4Zx*>o<[ׂ(\_ !WC[XYFz/\Dd?I:vK DtRgxj=vKo/\DdJڰc8&vK DtRyNnco[hC˯0V=g^ yîUx6`Sbhb% Sٚ}apȥRA×{1~L|]M^f`B1BN :ʴaG3c(eV%Ps".f Nd@Q?",pLA޳=FcSzd:1v} (,{h?SW{ )T+m$GE"@=4<Π ]O W5-cu`)Nɒ`9R Dr#RYg9Kk'xVeS瓕Rgڗ JʳҊj2})L> +L_ZQ-NVzVTҞ*gA%UcVD>UҰ-ڔ[M v~)xp0+3 d 5 +Rg/B(݇s!WJyST 1{##+4󬘊v#Pʮ37Y)Est))t'Qa{&D[*^A_0!uTO!L}U'֦b7g D?ߦ׮T|*y h/oo`~҉Xz>fV;ђuun]ĵ9X" Wo_A8~4#I"&?/@UTLS<+>q,Slҵ.E8,+KjB+thSiaή+P~THYvB!pJv ,n9gggZ|O3cZ12ˬϩ?n(t,,9NJ7Q}L 3Vh\nbj䧼#R2?MT_Zӓ/=r+͌}8 ~c&2+vznF7=tV.y R,Zt=91q*cOq·O1SgmS&>އAq$drw֙joO~Sf%d/qA:;+wQ 1q\E/fTx6{EYؒ;P>JT}aޞ=gE_I5u<BT72AQ4-ˑidV>#1}r:dq%P8T_,SPʓs>b Ai tJu?JJ9ͳRoaXi^4n4JEYﻅn2$\y<-%Z9֫๣ XνH@XY8CJZ˄fcƿ]?,^`fF֔ڄ5w!nkeO3Gtd}Pji|U vnPk^OS4%^noy=f:-\h(TiRzdW!K˔P ,b3^KI:y\.AÎzN:j#Xo'礛VTj8}d"I LoT_}:ON7.f/_);'+Wbc0ӏ PFMRğy2e˺}fp"!qg q:IOW ySy-b\ &n!rKcsXU9/j>BhTLr v37ӄsU/NXY\yH(<5 d1hK#R)s-_&?./( .rT(LW>2E7e*4?ڪ:=ӥd*|ۏ!WRƈ_ ]]ag\sD>}>Jw80Q(kB?Ytf>>!CY B&Ε[5_/ >|*ӉG?L&PEkC%ӵ]#ŃίߚȆ".>tһ7L!P * B>ǚ/d Yzȩy^\hBv^ѳ<8K9%Fq O$=pPKBV&BTDGv残h4Bdrg鑪޳I VDPuLS>bq*}K" W#sfXiPK鑶TBqlwz-: y,l8G/`HfG|ZWP)$cj)ƣ莾q(fo\;s"\W߇[ '7K9<1@J(,~\hȂZ~Ts#Lc ( ݘK&O3,KYfh ހ4͸x{KG^^t^ 
XXKphih(<㩊fPްI.vy̹Է46I6&m:ini\s9$[3ԣ;>զv֮t$Ǟт'כzblLx>lF\S D0=;|jD+2Y0.@AFÀ+$RŚgZm&AwڔA@lGxΓHpetI7t>}JPbUFPLp[P}^&Lyn2*0 08H!HPĘW%1(Ѣ}uP2]ԩ`ׇ,ڇs^:mYYpR\H %ck !D|@:(h(, f 7g׽"7Ṟv׽B*/d֋&WZShnNV ^9eHB\{_oW}5Vkz6Dq?=WRYtُx0qwUy^y|HMͯCԕ37jݯju@:dx==|so hq+q2-,=]:Z sm q2h'2ŸP L21y,gPJ@f{:0=V=HmNA(m]7jq%ВRGQxx%\M1x#4X v@Y]) m;k#,W}*N\ȌWoیdZvXDp)-zV΋ٿq;0D|np|5ԥJm tj:DljɅЇdGpjEtN~*WT݉fzH=mR.: = TL 1u1r#]@"s1l%=L֋:-M+:ԏ<)a93ᶜ"`j(hTZ= ct]px5Ֆ-UQt,^CsR bOx[{vSwq0^p@|Ʌ'o8wҙ+uDbUf>A|h~`b2x:ǔo!u&,] V,`"?\+[eI5]wEb4>He87ҕ\ p(u.2J(xIU DiPy"Ղ b~}v~PeviviY׳Px gsT`&R)Tnr ."7^ eIǭQ/[FImS֔LPΏ! <)K%)smqJ#$I6Dz` g#W3J1G)ocWEȴ#:)1ZVt:lO>(?}Շx}U>dZ%2!LrMOũ|Dw+1o(h9rH+*=U1ƏڗsT)'{R 蕭ISHf$jFdCtY&Ϥ0L0VϬMj v*ߛ=ݾ 7{Lԡ\$߈uA /o "P #bRhR1m+FPdZŹ9|s ]omKwgel'Y}W^^k 3>>]OF*\/w뛏~)o}y Bh*Xɏ4U[䯟m,Rh'Ao 'O+?h= 7g2vzr³=/jb,xf &P$A.ߧ?tޭp;uH܋A|R>رb {[N+ GՔI;6i#(PS:ׁ$ 2 >O]3~.oOw7{]g/k'uL:.D=JIs/DqDy9iArlZ=$ݪd\2?d7Cj^Vn/)_/uxyk^YK=gOooܳgѭ]Azt(03n.nMƟv=FYVꬄOaU!fN\Ƃ{S `al{}{*oxN: o@C_wái%GDR=rA[ؓ:I}E_j 5jy_lpP[j:ϞImZ״!@lh7}\ec6,5ca}z{i1h S-w.]M$c Sh![Zf2\xO"O/ŇWUgUc|٢$ڳj>mǴ%?Z~LɷVh!+hZe s]y7M6z\@']tWVyLD6Br`M>׈w3i?wA trŻM0mb}{D6Brn`*1+"50z^iG/jOy^:N7!nU\14ɵ.?HrCkO -f?+fhC&gq$M~eɦ4JAl1e33(0ʐC@6x4оÞaAԆc'(J̓F^-[fc}66L r,ӳO\RUhc\Em!C8ҮҪWWlʦu <\,no? WׯwZ9z.FpC>ͥ6՜I6.ۜd9kWFHXR@bYZA]0ÜB 1D FN6,EYd/@L\Px 7)gUU2g+WXLe%s3 J*"4 4Un*is>߄&le+G!Y&\L|[4"E_jrPTҖXmKgjԖd9zT+ ^ AUYЉZ RH3GnEiL68L ͂ g \,t5yp1_ʢ$*nwdl)k)})ʺ.$3(Qz4z c"oۑOԈ/9db#jE 6)$E0ID,Q<Ϟ'!! iiHD Ϟ'L*V>(ѨwCUpf&Ӑb5XoO{Ns'{OɃa< 2; ?)_T X؆rc S)(31TDAlƩ`OPNqF+Ml}* :oEi7ߟgQN?߼[jVFج.ɽ9YF+R]t ~(g(=ԻifxMitFOq^24b`DPp/<䕰?ϗK/,s:imM{z|*)ٴXx-8a8 qfW=۔V*C61앞]WKLfGÑ$ZM!Yw5IaIh>d+ aw"N0ObFSE&%y\P8R`AUp&I*,]Y s)6ht=IRZtjXy_F+]]jn+v5m R^׃x, bmF{nvM/ۄ#߹ ocq< =ɔGXjkʂBGˀ.U*jV!֊gj-6d~#z 5lIacJIYإP ;29rնѣ1>jv[ QmRQLl/ڤ<AD1oxcB\)rcR+YX~)/2m v6a" YzYݭI̓W3D!y9*f\@XJƤ,b.W _Q߷nj怾92.@sl&wΙxM<3"Xo ˆZ,k+[VK4BJ9Q"ۚ0vڑE^; l]j\oTz}Kك /ZK[ߜrqsuY9E~iزE'aUç w~~ ?U>OaDV/{/ KA8`FiC-9j8ٺ먹 4 ٸN4)@2Q#zpZSWKȄA2ܔ! 
V\%JG'|ITIirNIUTPmIlevؒi<:CtؐxJ/I})(=hbbR0 bbC#@G4J)їRK]QD)%FjAFQz(XJ{mPgR FǃF)4h|)Y%6qqt(T4TyT$Fjaї6J)їRxߦ}i#5 }KXe[8>v=EĐ] :/2]‘SHL> QK[tZ* +ͣB7j6^k|C`P[ FoM o~ LQ@$]QA|e))v6[!)SZ킄H 6e#daܚh~! O/b'`29%|-%0N[T:0QJ%TVhWBi+Y2sŒ,$n1\]ƺXZ˘w!1s[qV7v?6%:>(ݽwR?<<<uܕ4‰9m@)IpZRW^_:b2\4'?1(-/}Mg*6M-wVAA!o&>MGב 3oȤyꛩ])]U>L'MJİ-L͢~*>A]waZ=(rSǫjWiqUmjTw(RWۏ:YTwRF7 SwSƽVg.gpm>-8ZD&Fϗ_>³Mo}F%JX-@9dw< 5*A@R(9/3lP=RG˃?<1]'R=ket` T-"M|sk} +K-vr1YZ'x^ $+-fX 1١: A-!P*bqJ ( VG4J11Gl /I})`tSdPJ&;z(}IMpDTI4~/ƥ!.}YMAG)4iJUJK@)&FjD<PJM"qJIZ:RLYL0pQ?֯0(l%m(cꤎ jO+ Bjz  $i19K+ (Vdw݄r<|C` dEagCZ«e-BW( b_Ly]l$DGHEaGŭ枋ŽeT_fb@{,6ޓqcW2T7dKUEʚVg0}[Rf-'-wu||wF_t!4̾y Wq zҷdV zr΃| ::CQ  Ar"7_y^H".c%_8+ X>?V\xKU‡ A8;!U0i. Z}pc . 4'[-km@/_8ժ =~nz;3:%ȥzj鼽Z}]kg;`B$`rwI151y<|*<77Ш&Q}Q/rs^hmJr[!/ЪКq; PD;|XYSPI_(m'sWx$y!#Q@U=ƹ=$5\Y{"FWTřMYñdk0: kQ [' TYTܜƅv=fD4 d ꄷxT`d:-ڋl&b cHC9Z:E7X$%Gs+\GWiF<'8k](-s (ѽsOLӈ~,hw*J濒hiw.OָK # Y $X!u^hA=ujs\ #$ Ø^Z E`(I1P@KN Un)Kʙ$2JRKPJenR"K- dDqr@IG s&gZ#~L]24]akF#,"iRx` \^2-InҢ9Fzl&P\$z؄ޤ:4d.#C^ߝ7ŧ†Ƒz ZUyYyZ)'~ҭcs0 v&4 _-\.Cݻ3yI!k߾:CMX,Ww1>3i-=~NoN.Wl1J콏CcMI+g wH@͉FSgg뼣_@C/5H)ىI|&{iX3/ k)˅J[M^xma=J!Rxq/U&d\{Vg{Han wVb^$h%-XU%e*ؓgkTHySQI"bub$"Pi#HGE/0$pd&TM"n6wv6|w~想Ws7 D-,O(tydT,jiTϛ" [·ۦjqrZ|Tj_8ԗݿ vGAqjm2;{+_ g#_AVDWPKt(B7bϊO)-S8 Np\vA%yhh>^[_TcOZQSKU4h9ddIOrJTK9z Tt ˼4e5 *z :˪bA.;vݦ%i [.o~v6Be4!n\_g6Ga7+XmMdw œ*Oz*󆃓!9b6RP3)ip[^A-E8 QznH~q{=\G<^t?\W}wźMRo|77 - *ףB 'd|v&_WP-X'x'RLI880ڢ=ݟ&nhug;S.ܪ bGhLjӵĺ)&hRi#:k4n.j k{c/Inv*r_CU- ՒUApaڊG4vFCb-6dl<ޥ0 F`JsvpK:ILbIc; &ŒNoȒN WZ"jm4\]jI}Z/SOABv&<1B{Bzj8FU?gko E@+6 p+-.ph¯jפ DXħ"q\;)_*eFYȢ2ّ5JNQVI} (9˚>ȶoVt~FȿoaI! 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 17:18:02 crc kubenswrapper[4795]: body: Mar 20 17:18:02 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:43.440719698 +0000 UTC m=+6.898751279,LastTimestamp:2026-03-20 17:17:43.440719698 +0000 UTC m=+6.898751279,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:02 crc kubenswrapper[4795]: > Mar 20 17:18:02 crc kubenswrapper[4795]: E0320
17:18:02.607802 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c370a518bb6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:43.44080479 +0000 UTC m=+6.898836361,LastTimestamp:2026-03-20 17:17:43.44080479 +0000 UTC m=+6.898836361,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.616633 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 17:18:02 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-apiserver-crc.189e9c390dc4ca46 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 17:18:02 crc kubenswrapper[4795]: body: 
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 17:18:02 crc kubenswrapper[4795]: Mar 20 17:18:02 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.088623686 +0000 UTC m=+15.546655227,LastTimestamp:2026-03-20 17:17:52.088623686 +0000 UTC m=+15.546655227,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:02 crc kubenswrapper[4795]: > Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.621921 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c390dc568fd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.088664317 +0000 UTC m=+15.546695858,LastTimestamp:2026-03-20 17:17:52.088664317 +0000 UTC m=+15.546695858,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.627432 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9c390dc4ca46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event=< Mar 20 17:18:02 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-apiserver-crc.189e9c390dc4ca46 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 17:18:02 crc kubenswrapper[4795]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 17:18:02 crc kubenswrapper[4795]: Mar 20 17:18:02 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.088623686 +0000 UTC m=+15.546655227,LastTimestamp:2026-03-20 17:17:52.093232092 +0000 UTC m=+15.551263643,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:02 crc kubenswrapper[4795]: > Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.632281 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9c390dc568fd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c390dc568fd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.088664317 +0000 UTC m=+15.546695858,LastTimestamp:2026-03-20 17:17:52.093280544 +0000 UTC m=+15.551312105,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.637552 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 17:18:02 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-apiserver-crc.189e9c3912c2512d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 20 17:18:02 crc kubenswrapper[4795]: body: Mar 20 17:18:02 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.172347693 +0000 UTC m=+15.630379244,LastTimestamp:2026-03-20 17:17:52.172347693 +0000 UTC m=+15.630379244,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:02 crc kubenswrapper[4795]: > Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.642766 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3912c30bc9 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:52.172395465 +0000 UTC m=+15.630427026,LastTimestamp:2026-03-20 17:17:52.172395465 +0000 UTC m=+15.630427026,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.648586 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9c3669c7f9e1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c3669c7f9e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:40.747434465 +0000 UTC m=+4.205466006,LastTimestamp:2026-03-20 17:17:52.388756377 +0000 UTC m=+15.846787908,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.655179 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:18:02 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c395e623ebd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded Mar 20 17:18:02 crc kubenswrapper[4795]: body: Mar 20 17:18:02 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:53.441119933 +0000 UTC m=+16.899151474,LastTimestamp:2026-03-20 17:17:53.441119933 +0000 UTC m=+16.899151474,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:02 crc kubenswrapper[4795]: > Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.660679 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c395e6341a7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:53.441186215 +0000 UTC m=+16.899217756,LastTimestamp:2026-03-20 17:17:53.441186215 +0000 UTC m=+16.899217756,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.972107 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.972356 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.974034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.974096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.974114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4795]: I0320 17:18:02.975090 4795 scope.go:117] "RemoveContainer" containerID="6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9" Mar 20 17:18:02 crc kubenswrapper[4795]: E0320 17:18:02.975400 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.189260 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.440868 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.440966 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.441035 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.441221 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.443672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.443793 
4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.443819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.445289 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 17:18:03 crc kubenswrapper[4795]: I0320 17:18:03.445634 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76" gracePeriod=30 Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.450013 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:18:03 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb26b5046 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers) Mar 20 17:18:03 crc kubenswrapper[4795]: body: Mar 20 17:18:03 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.440934982 +0000 UTC m=+26.898966563,LastTimestamp:2026-03-20 17:18:03.440934982 +0000 UTC m=+26.898966563,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:03 crc kubenswrapper[4795]: > Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.455890 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb26c53bc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.441001404 +0000 UTC m=+26.899032995,LastTimestamp:2026-03-20 17:18:03.441001404 +0000 UTC m=+26.899032995,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.463415 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb2b27973 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.445598579 +0000 UTC m=+26.903630130,LastTimestamp:2026-03-20 17:18:03.445598579 +0000 UTC m=+26.903630130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.581396 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c35dd9dceef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35dd9dceef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.395860719 +0000 UTC m=+1.853892260,LastTimestamp:2026-03-20 17:18:03.576304096 +0000 UTC m=+27.034335647,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.815807 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c35f05817a1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35f05817a1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.710058913 +0000 UTC m=+2.168090494,LastTimestamp:2026-03-20 17:18:03.808653271 +0000 UTC m=+27.266684822,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:03 crc kubenswrapper[4795]: E0320 17:18:03.828309 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c35f1331641\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c35f1331641 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container 
cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:38.724410945 +0000 UTC m=+2.182442516,LastTimestamp:2026-03-20 17:18:03.821507722 +0000 UTC m=+27.279539303,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.188708 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.439448 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.439954 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76" exitCode=255 Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.440000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76"} Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.440026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7"} Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.440117 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.441262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.441327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4795]: I0320 17:18:04.441344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.189642 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.492084 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.493573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.493636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.493658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4795]: I0320 17:18:05.493725 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:18:05 crc kubenswrapper[4795]: E0320 17:18:05.501482 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:18:05 crc 
kubenswrapper[4795]: E0320 17:18:05.501809 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:18:06 crc kubenswrapper[4795]: I0320 17:18:06.187288 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:07 crc kubenswrapper[4795]: I0320 17:18:07.187804 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:07 crc kubenswrapper[4795]: E0320 17:18:07.354435 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:18:08 crc kubenswrapper[4795]: I0320 17:18:08.184770 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:09 crc kubenswrapper[4795]: I0320 17:18:09.187531 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:09 crc kubenswrapper[4795]: W0320 17:18:09.213066 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 17:18:09 crc kubenswrapper[4795]: E0320 17:18:09.213151 4795 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.187228 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.381180 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.381412 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.383011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.383113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.383135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.440251 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.456574 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.458303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.458501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:10 crc kubenswrapper[4795]: I0320 17:18:10.458640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:11 crc kubenswrapper[4795]: I0320 17:18:11.187780 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:11 crc kubenswrapper[4795]: W0320 17:18:11.845101 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:11 crc kubenswrapper[4795]: E0320 17:18:11.845182 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.187037 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.501927 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.503349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.503402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.503422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:12 crc kubenswrapper[4795]: I0320 17:18:12.503454 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:18:12 crc kubenswrapper[4795]: E0320 17:18:12.509504 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:18:12 crc kubenswrapper[4795]: E0320 17:18:12.509538 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:18:13 crc kubenswrapper[4795]: I0320 17:18:13.186840 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:13 crc kubenswrapper[4795]: I0320 17:18:13.440895 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:18:13 crc kubenswrapper[4795]: I0320 17:18:13.440993 4795 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 17:18:13 crc kubenswrapper[4795]: E0320 17:18:13.448108 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c3bb26b5046\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:18:13 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb26b5046 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 17:18:13 crc kubenswrapper[4795]: body: Mar 20 17:18:13 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.440934982 +0000 UTC m=+26.898966563,LastTimestamp:2026-03-20 17:18:13.440968726 +0000 UTC m=+36.899000307,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:13 crc kubenswrapper[4795]: > Mar 20 17:18:13 crc kubenswrapper[4795]: E0320 17:18:13.454984 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c3bb26c53bc\" is forbidden: User \"system:anonymous\" cannot 
patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb26c53bc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.441001404 +0000 UTC m=+26.899032995,LastTimestamp:2026-03-20 17:18:13.441029047 +0000 UTC m=+36.899060618,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:18:14 crc kubenswrapper[4795]: I0320 17:18:14.187548 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:15 crc kubenswrapper[4795]: I0320 17:18:15.187245 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 17:18:16.187991 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 
17:18:16.252104 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 17:18:16.254033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 17:18:16.254113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 17:18:16.254136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:16 crc kubenswrapper[4795]: I0320 17:18:16.255156 4795 scope.go:117] "RemoveContainer" containerID="6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9" Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.188964 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:17 crc kubenswrapper[4795]: E0320 17:18:17.354762 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.479727 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.482463 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db"} Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.482641 4795 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.483906 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.483967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:17 crc kubenswrapper[4795]: I0320 17:18:17.483992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.184154 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.488569 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.489422 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.493199 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db" exitCode=255 Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.493276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db"} Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.493353 4795 scope.go:117] 
"RemoveContainer" containerID="6b019eb48bb06c15bef7785609a6b8ca6556f9fd95b52b880f7917b4f6724ca9" Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.493564 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.495129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.495184 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.495219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:18 crc kubenswrapper[4795]: I0320 17:18:18.496142 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db" Mar 20 17:18:18 crc kubenswrapper[4795]: E0320 17:18:18.496492 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.186804 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.498602 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:18:19 crc 
kubenswrapper[4795]: I0320 17:18:19.510527 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.512214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.512268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.512286 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:19 crc kubenswrapper[4795]: I0320 17:18:19.512319 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:18:19 crc kubenswrapper[4795]: E0320 17:18:19.518135 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:18:19 crc kubenswrapper[4795]: E0320 17:18:19.518345 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:18:20 crc kubenswrapper[4795]: I0320 17:18:20.186441 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:21 crc kubenswrapper[4795]: W0320 17:18:21.033929 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 17:18:21 crc 
kubenswrapper[4795]: E0320 17:18:21.033992 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 17:18:21 crc kubenswrapper[4795]: I0320 17:18:21.184620 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.171297 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.171507 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.173190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.173258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.173279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.174106 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db" Mar 20 17:18:22 crc kubenswrapper[4795]: E0320 17:18:22.174392 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.187501 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:22 crc kubenswrapper[4795]: W0320 17:18:22.530681 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 17:18:22 crc kubenswrapper[4795]: E0320 17:18:22.530812 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.971848 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.972104 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.973905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.973975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.973997 4795 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:22 crc kubenswrapper[4795]: I0320 17:18:22.974977 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db" Mar 20 17:18:22 crc kubenswrapper[4795]: E0320 17:18:22.975281 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:23 crc kubenswrapper[4795]: W0320 17:18:23.146200 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 17:18:23 crc kubenswrapper[4795]: E0320 17:18:23.146295 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 17:18:23 crc kubenswrapper[4795]: I0320 17:18:23.188358 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:23 crc kubenswrapper[4795]: I0320 17:18:23.440908 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:18:23 crc kubenswrapper[4795]: I0320 17:18:23.441014 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 17:18:23 crc kubenswrapper[4795]: E0320 17:18:23.447589 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c3bb26b5046\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:18:23 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c3bb26b5046 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 17:18:23 crc kubenswrapper[4795]: body: Mar 20 17:18:23 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:18:03.440934982 +0000 UTC m=+26.898966563,LastTimestamp:2026-03-20 17:18:23.440986491 +0000 UTC m=+46.899018062,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:18:23 crc kubenswrapper[4795]: > Mar 20 17:18:24 crc kubenswrapper[4795]: I0320 
17:18:24.187654 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:25 crc kubenswrapper[4795]: I0320 17:18:25.185936 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.187075 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.518314 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.519882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.519945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.519958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:26 crc kubenswrapper[4795]: I0320 17:18:26.519984 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:18:26 crc kubenswrapper[4795]: E0320 17:18:26.526220 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:18:26 crc kubenswrapper[4795]: E0320 
17:18:26.526566 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:18:27 crc kubenswrapper[4795]: I0320 17:18:27.187995 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:27 crc kubenswrapper[4795]: E0320 17:18:27.355040 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:18:28 crc kubenswrapper[4795]: I0320 17:18:28.187971 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:29 crc kubenswrapper[4795]: I0320 17:18:29.186746 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:30 crc kubenswrapper[4795]: I0320 17:18:30.188436 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:31 crc kubenswrapper[4795]: I0320 17:18:31.187372 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 
20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.187023 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.400060 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.400213 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.401316 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.401355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.401365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.402978 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.535417 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.536472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.536538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:32 crc kubenswrapper[4795]: I0320 17:18:32.536557 4795 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.186376 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.527044 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.528079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.528110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.528121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.528147 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:18:33 crc kubenswrapper[4795]: E0320 17:18:33.533017 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:18:33 crc kubenswrapper[4795]: E0320 17:18:33.533243 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.920358 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 
20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.920551 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.921676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.921754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:33 crc kubenswrapper[4795]: I0320 17:18:33.921769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:34 crc kubenswrapper[4795]: I0320 17:18:34.184282 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:35 crc kubenswrapper[4795]: I0320 17:18:35.185637 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:36 crc kubenswrapper[4795]: I0320 17:18:36.186801 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.185823 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.251314 4795 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.252498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.252572 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.252589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:37 crc kubenswrapper[4795]: I0320 17:18:37.253816 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db" Mar 20 17:18:37 crc kubenswrapper[4795]: E0320 17:18:37.254061 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:37 crc kubenswrapper[4795]: E0320 17:18:37.355844 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:18:38 crc kubenswrapper[4795]: I0320 17:18:38.184596 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:39 crc kubenswrapper[4795]: I0320 17:18:39.185779 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at 
the cluster scope Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.188746 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.534013 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.535195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.535235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.535247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:40 crc kubenswrapper[4795]: I0320 17:18:40.535270 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:18:40 crc kubenswrapper[4795]: E0320 17:18:40.541318 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:18:40 crc kubenswrapper[4795]: E0320 17:18:40.541592 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:18:41 crc kubenswrapper[4795]: I0320 17:18:41.193427 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API 
group "storage.k8s.io" at the cluster scope Mar 20 17:18:41 crc kubenswrapper[4795]: I0320 17:18:41.818287 4795 csr.go:261] certificate signing request csr-l2n5d is approved, waiting to be issued Mar 20 17:18:41 crc kubenswrapper[4795]: I0320 17:18:41.830147 4795 csr.go:257] certificate signing request csr-l2n5d is issued Mar 20 17:18:41 crc kubenswrapper[4795]: I0320 17:18:41.910516 4795 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.028745 4795 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.252084 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.253602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.253642 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.253653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.832155 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-24 17:50:32.950082161 +0000 UTC Mar 20 17:18:42 crc kubenswrapper[4795]: I0320 17:18:42.832197 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5976h31m50.117887473s for next certificate rotation Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.356324 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:18:47 crc 
kubenswrapper[4795]: I0320 17:18:47.541830 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.543647 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.543730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.543750 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.543885 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.554849 4795 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.555186 4795 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.555223 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.559605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.559664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.559719 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.559751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 
17:18:47.559776 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:47Z","lastTransitionTime":"2026-03-20T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.578806 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.588034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.588069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.588081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.588119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.588132 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:47Z","lastTransitionTime":"2026-03-20T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.601559 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.611342 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.611398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.611418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.611440 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.611457 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:47Z","lastTransitionTime":"2026-03-20T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.627070 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.638266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.638321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.638334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.638352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:47 crc kubenswrapper[4795]: I0320 17:18:47.638365 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:47Z","lastTransitionTime":"2026-03-20T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.652858 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.653077 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.653108 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.753950 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.854938 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:47 crc kubenswrapper[4795]: E0320 17:18:47.955988 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.056093 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.156248 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.256351 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.357408 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.458108 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.558501 4795 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.659637 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.760709 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.861837 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:48 crc kubenswrapper[4795]: E0320 17:18:48.962766 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.063733 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.164439 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.251701 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.252962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.253027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.253045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.254034 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db" Mar 20 17:18:49 
crc kubenswrapper[4795]: E0320 17:18:49.265249 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.366269 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.467362 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.567854 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.579827 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.582086 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1"} Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.582253 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.583351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.583392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:49 crc kubenswrapper[4795]: I0320 17:18:49.583404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.668319 4795 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.768805 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.869854 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:49 crc kubenswrapper[4795]: E0320 17:18:49.971054 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.071623 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.172169 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.272737 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.373536 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.474668 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.575393 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.587782 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.588784 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.591274 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" exitCode=255 Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.591327 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1"} Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.591372 4795 scope.go:117] "RemoveContainer" containerID="37b98782e9665a160ee4c22101e733698efb399833274ecbcab81ae40cea34db" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.591609 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.593294 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.593337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.593354 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:50 crc kubenswrapper[4795]: I0320 17:18:50.594181 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.594456 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.676270 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.776654 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.877229 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:50 crc kubenswrapper[4795]: E0320 17:18:50.977815 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.078803 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.179133 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.279199 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.380367 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.481087 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.581759 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: I0320 17:18:51.597377 4795 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.681919 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.782257 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.882660 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:51 crc kubenswrapper[4795]: E0320 17:18:51.983744 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.084322 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.171629 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.171939 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.173881 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.173958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.173976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.175050 4795 scope.go:117] "RemoveContainer" 
containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.175314 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.185012 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.286191 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.386878 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.487152 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.588153 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.688779 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.788922 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.889952 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.972661 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.972906 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.974419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.974472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.974489 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:52 crc kubenswrapper[4795]: I0320 17:18:52.975569 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.975992 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:18:52 crc kubenswrapper[4795]: E0320 17:18:52.990768 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:53 crc kubenswrapper[4795]: E0320 17:18:53.091354 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:53 crc kubenswrapper[4795]: E0320 17:18:53.191990 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:53 crc kubenswrapper[4795]: E0320 17:18:53.292924 4795 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.328360 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.396093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.396151 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.396170 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.396193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.396210 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.499175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.499311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.499329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.499355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.499373 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.602075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.602123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.602135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.602155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.602168 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.704529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.704596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.704614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.704641 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.704660 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.808115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.808216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.808228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.808254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.808268 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.910785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.910858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.910888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.910918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:53 crc kubenswrapper[4795]: I0320 17:18:53.910940 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:53Z","lastTransitionTime":"2026-03-20T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.013618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.013668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.013709 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.013732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.013749 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.116547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.116607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.116627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.116656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.116740 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.209121 4795 apiserver.go:52] "Watching apiserver" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.214782 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.215125 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.215667 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.215734 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.215791 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.216176 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.216177 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.216254 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.216726 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.216923 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.216815 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.219418 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.219390 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.219737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.219946 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220091 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.219952 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220117 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220769 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220775 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.220811 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.223214 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.223673 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.255073 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.269670 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.287511 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.288744 4795 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289840 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289882 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289914 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289945 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.289976 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290068 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290097 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290126 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290156 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: 
I0320 17:18:54.290187 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290216 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290254 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290337 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290370 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290431 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290490 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290519 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290549 
4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290579 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291162 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291179 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291656 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291862 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.291913 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.292026 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.292070 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.292127 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.292234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.290610 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.292976 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293044 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293106 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293504 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293569 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293588 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.293872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294020 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294098 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294392 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294492 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.294963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.295025 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.295060 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.295592 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.295106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296019 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296077 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296124 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296185 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296236 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296282 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296326 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296374 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296418 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296462 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296572 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296609 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296618 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296894 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.296941 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297191 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297242 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297287 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") 
" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297332 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297377 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297425 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297475 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297523 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297614 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297748 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297822 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " 
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297920 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.297967 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298062 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298111 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298158 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298255 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298303 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298311 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298355 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298403 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298498 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298547 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298595 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.298641 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.299491 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300224 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300237 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300471 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300578 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300714 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300824 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300875 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300930 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.300980 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301006 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301027 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301073 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301117 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301124 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: 
"e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301165 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301214 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301262 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301272 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301359 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301414 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301494 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301520 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301572 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301669 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301754 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301803 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301856 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301886 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.301909 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302081 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302146 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302168 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302171 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302200 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302252 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302305 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302359 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302455 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302500 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302550 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302604 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302651 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302736 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302786 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302834 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302886 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302936 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303136 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303181 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303224 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303309 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303381 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303428 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.303477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303526 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303622 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303755 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303803 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303854 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304010 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.304058 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304158 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304211 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304261 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304309 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304357 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304460 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308453 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308593 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.308674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308785 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308842 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308904 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308971 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309042 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309102 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309170 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309229 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309394 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309466 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309533 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309588 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309650 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309803 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310032 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310089 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310153 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310342 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310409 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310497 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310564 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310628 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310715 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310780 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310907 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310961 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311026 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311088 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311142 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311204 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311267 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311322 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311390 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311578 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311867 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312001 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312063 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312255 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: 
"3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302576 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302550 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302874 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.302824 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.303907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304394 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.304447 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.305659 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.307058 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.307371 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.307387 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308329 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308488 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.308897 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309456 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309488 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309593 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.309834 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310224 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310491 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310653 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.310755 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311003 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311440 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311606 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311604 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.311704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312141 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312187 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312794 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312849 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312377 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.313605 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.313698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.313882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.313913 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.314871 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.314979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.315354 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.315990 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.316593 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.317167 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.317620 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.317944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.318208 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.318858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.315926 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.319640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.319662 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.318311 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.312788 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.322207 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.322226 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.322868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.322973 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.323551 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.323681 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.323752 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.323816 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.323929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324243 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324367 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324453 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324760 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324775 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324789 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.324944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.325157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.325483 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.325545 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.325856 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.325999 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.326254 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.326714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327118 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327244 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327403 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327622 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327761 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.327807 4795 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.319161 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.328158 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.328386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.328605 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.328859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.319591 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.328958 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.329059 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.319560 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.329495 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:54.829450216 +0000 UTC m=+78.287481797 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.329510 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.329951 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.329971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330371 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330426 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330454 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.330489 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330481 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.330674 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:54.830650924 +0000 UTC m=+78.288682475 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.317307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.330985 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.331196 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.331448 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:54.831411158 +0000 UTC m=+78.289442709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.331476 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.331494 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.331509 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.331580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.333007 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.333988 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.334473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.335230 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.335629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.337223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.337513 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.338177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.339653 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.339888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.340238 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.340361 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.344610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345234 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.342800 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.341276 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.341345 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.341482 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.341710 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.341788 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.342173 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.342582 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.342812 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.343320 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.345483 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.345497 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.344882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345267 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.342800 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.344787 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.345555 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:54.845535785 +0000 UTC m=+78.303567336 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345579 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345600 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345616 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345634 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345650 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345667 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345702 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345720 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345732 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345744 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345754 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345766 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345778 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.345790 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345801 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345815 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345826 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345841 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345853 4795 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345867 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 
17:18:54.345880 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345891 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345903 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345914 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345925 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345937 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345949 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345961 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345972 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345983 4795 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.345994 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346005 4795 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346018 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346031 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346042 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 
17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346053 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346064 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346075 4795 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346086 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346196 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.346726 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.347852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.348071 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.349944 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.352499 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.354638 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.355172 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.355509 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.355898 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.356055 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.356069 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.356101 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.356132 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.356154 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.356353 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.356500 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:54.856446313 +0000 UTC m=+78.314477954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.359002 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.360219 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.361039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.361349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.361431 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.361890 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.362217 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.363137 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.363156 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.364173 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.365098 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.365277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.366106 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.372183 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.376083 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.377274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.378407 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.383811 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.390852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446574 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446679 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446839 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446849 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446868 4795 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446900 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446924 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446947 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.446973 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447058 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447079 4795 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447100 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447121 4795 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447141 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447160 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447177 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447194 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447211 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447228 4795 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447244 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447261 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447281 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447298 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447315 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447333 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447351 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447369 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447387 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447404 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447421 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447438 4795 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447454 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447471 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 
17:18:54.447488 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447505 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447522 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447539 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447560 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447583 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447618 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447636 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447653 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447671 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447728 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447754 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447775 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447792 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447808 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447827 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447853 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447877 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447902 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447923 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447940 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447957 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447977 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.447994 4795 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448010 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448031 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448057 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448080 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448116 4795 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" 
Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448140 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448163 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448176 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448185 4795 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448264 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448300 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448371 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448413 4795 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448428 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448440 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448452 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448464 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448475 4795 reconciler_common.go:293] "Volume detached for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448486 4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448497 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448507 4795 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448519 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448532 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448545 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448556 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448568 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448579 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448590 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448601 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448612 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448623 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448636 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node 
\"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448647 4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448659 4795 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448669 4795 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448699 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448711 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448723 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448735 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.448746 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448759 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448770 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448781 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448791 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448803 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448814 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448824 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448836 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448846 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448857 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448868 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448879 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448890 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448901 4795 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448912 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448923 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448935 4795 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448948 4795 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448959 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448969 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448981 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.448991 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449002 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449014 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449024 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449035 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449046 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449057 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449067 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449078 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449089 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449101 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449112 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449123 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449138 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc 
kubenswrapper[4795]: I0320 17:18:54.449149 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449160 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449172 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449183 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449194 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449205 4795 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449217 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 
17:18:54.449231 4795 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449246 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449260 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449275 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449288 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449304 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449322 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449338 4795 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449354 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449369 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449384 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449399 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449413 4795 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449428 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.449440 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: 
I0320 17:18:54.449451 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.538533 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.551253 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.552352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.552509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.552658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.553051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.553197 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.553376 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: source /etc/kubernetes/apiserver-url.env Mar 20 17:18:54 crc kubenswrapper[4795]: else Mar 20 17:18:54 crc kubenswrapper[4795]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:18:54 crc kubenswrapper[4795]: exit 1 Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:18:54 crc kubenswrapper[4795]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.554630 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.565639 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:18:54 crc kubenswrapper[4795]: W0320 17:18:54.570403 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-48351f663652fe24072c75132f500abceb2237709b403dea504d4e856a175414 WatchSource:0}: Error finding container 48351f663652fe24072c75132f500abceb2237709b403dea504d4e856a175414: Status 404 returned error can't find the container with id 48351f663652fe24072c75132f500abceb2237709b403dea504d4e856a175414 Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.573556 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:18:54 crc kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: source "/env/_master" Mar 20 17:18:54 crc kubenswrapper[4795]: set +o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 17:18:54 crc kubenswrapper[4795]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 17:18:54 crc kubenswrapper[4795]: ho_enable="--enable-hybrid-overlay" Mar 20 17:18:54 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 17:18:54 crc kubenswrapper[4795]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 17:18:54 crc kubenswrapper[4795]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-host=127.0.0.1 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-port=9743 \ Mar 20 17:18:54 crc kubenswrapper[4795]: ${ho_enable} \ Mar 20 17:18:54 crc kubenswrapper[4795]: --enable-interconnect \ Mar 20 17:18:54 crc kubenswrapper[4795]: --disable-approver \ Mar 20 17:18:54 crc kubenswrapper[4795]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --wait-for-kubernetes-api=200s \ Mar 20 17:18:54 crc kubenswrapper[4795]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --loglevel="${LOGLEVEL}" Mar 20 17:18:54 crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.577190 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:18:54 crc 
kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: source "/env/_master" Mar 20 17:18:54 crc kubenswrapper[4795]: set +o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: Mar 20 17:18:54 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --disable-webhook \ Mar 20 17:18:54 crc kubenswrapper[4795]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --loglevel="${LOGLEVEL}" Mar 20 17:18:54 crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.578478 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.584529 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePo
licy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.585828 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.610182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7c6e2fcdb8d74e6bd0b7cd1a4feb61a303cc77b9d56a711cfd374e3ae28af2a4"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.611102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"48351f663652fe24072c75132f500abceb2237709b403dea504d4e856a175414"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.612406 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cd6841410ce51491c24386da36797741c3e636d6a01a69f875e919885c6a19b8"} Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.612828 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set 
-xe Mar 20 17:18:54 crc kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: source "/env/_master" Mar 20 17:18:54 crc kubenswrapper[4795]: set +o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 17:18:54 crc kubenswrapper[4795]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 17:18:54 crc kubenswrapper[4795]: ho_enable="--enable-hybrid-overlay" Mar 20 17:18:54 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 17:18:54 crc kubenswrapper[4795]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 17:18:54 crc kubenswrapper[4795]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-host=127.0.0.1 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --webhook-port=9743 \ Mar 20 17:18:54 crc kubenswrapper[4795]: ${ho_enable} \ Mar 20 17:18:54 crc kubenswrapper[4795]: --enable-interconnect \ Mar 20 17:18:54 crc kubenswrapper[4795]: --disable-approver \ Mar 20 17:18:54 crc kubenswrapper[4795]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --wait-for-kubernetes-api=200s \ Mar 20 17:18:54 crc kubenswrapper[4795]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --loglevel="${LOGLEVEL}" Mar 20 17:18:54 
crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct 
envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.612837 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,Res
izePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.613763 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: source /etc/kubernetes/apiserver-url.env Mar 20 17:18:54 crc kubenswrapper[4795]: else Mar 20 17:18:54 crc kubenswrapper[4795]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:18:54 crc kubenswrapper[4795]: exit 1 Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:18:54 crc kubenswrapper[4795]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.615365 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.615424 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet 
been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.616418 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:18:54 crc kubenswrapper[4795]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:18:54 crc kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 20 17:18:54 crc kubenswrapper[4795]: set -o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: source "/env/_master" Mar 20 17:18:54 crc kubenswrapper[4795]: set +o allexport Mar 20 17:18:54 crc kubenswrapper[4795]: fi Mar 20 17:18:54 crc kubenswrapper[4795]: Mar 20 17:18:54 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 17:18:54 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:18:54 crc kubenswrapper[4795]: --disable-webhook \ Mar 20 17:18:54 crc kubenswrapper[4795]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 17:18:54 crc kubenswrapper[4795]: --loglevel="${LOGLEVEL}" Mar 20 17:18:54 crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:18:54 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.619271 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.621878 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.635817 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.647589 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.656983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.657016 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.657030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.657051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.657064 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.658816 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.670059 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.684558 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.694653 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.705844 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.720762 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.732901 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.744492 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.759173 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.760954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.760998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.761029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.761053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.761064 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.853612 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.853826 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.853898 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.853948 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.853994 4795 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:55.853965141 +0000 UTC m=+79.311996712 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.853958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854072 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:55.854027983 +0000 UTC m=+79.312059564 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854101 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854122 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854147 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:55.854133086 +0000 UTC m=+79.312164857 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854162 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854187 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.854257 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:55.854233869 +0000 UTC m=+79.312265460 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.863722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.863762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.863806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.863828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.863843 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.954960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.955305 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.955386 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.955412 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: E0320 17:18:54.955604 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:55.955570588 +0000 UTC m=+79.413602179 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.966405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.966498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.966516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.966577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:54 crc kubenswrapper[4795]: I0320 17:18:54.966595 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:54Z","lastTransitionTime":"2026-03-20T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.069661 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.069807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.069828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.069856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.069952 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.173043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.173111 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.173134 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.173168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.173191 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.258544 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.259635 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.262068 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.263306 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.265613 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.266611 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.267940 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.269948 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.271250 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.273937 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.275979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.276039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.276058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.276085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.276105 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.276777 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.279324 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.280494 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.281550 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.283407 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.284499 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.286538 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.287626 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.289060 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.291484 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.292629 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.294717 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.295917 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.298291 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.299358 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.300766 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.302932 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.304097 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.306617 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.308198 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.310262 4795 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.310552 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.314111 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.315215 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.316976 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.320099 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.321390 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.323415 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.325180 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.327598 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.328674 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.331043 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.332507 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.334824 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.335829 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.337785 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.338916 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.341553 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.342827 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.344788 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.345907 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.347958 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.349476 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.350748 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.379581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.379747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.379767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.379793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.379813 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.483278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.483366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.483397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.483434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.483466 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.587185 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.587251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.587273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.587307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.587331 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.691034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.691102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.691120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.691146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.691166 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.794780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.794838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.794856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.794880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.794898 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.863267 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.863397 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863518 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:57.863480036 +0000 UTC m=+81.321511607 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863620 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863659 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863743 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863756 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.863629 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863822 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:57.863807625 +0000 UTC m=+81.321839206 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.863851 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:57.863837826 +0000 UTC m=+81.321869407 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.863891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.864042 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.864120 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:57.864101995 +0000 UTC m=+81.322133596 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.898252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.898312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.898331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.898357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.898376 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:55Z","lastTransitionTime":"2026-03-20T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.959608 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 17:18:55 crc kubenswrapper[4795]: I0320 17:18:55.964595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.964798 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.964830 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.964849 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 17:18:55 crc kubenswrapper[4795]: E0320 17:18:55.964921 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:57.964899816 +0000 UTC m=+81.422931387 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.001444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.001494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.001511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.001536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.001555 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.104747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.104815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.104834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.104861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.104880 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.207581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.207635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.207656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.207680 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.207734 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.251194 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.251241 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 17:18:56 crc kubenswrapper[4795]: E0320 17:18:56.251365 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 17:18:56 crc kubenswrapper[4795]: E0320 17:18:56.251491 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.251636 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 17:18:56 crc kubenswrapper[4795]: E0320 17:18:56.251982 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.603986 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.604048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.604067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.604093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.604118 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.707074 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.707545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.707571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.707602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.707625 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.810457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.810542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.810560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.810582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.810598 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.912868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.912945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.912969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.912998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:56 crc kubenswrapper[4795]: I0320 17:18:56.913022 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:56Z","lastTransitionTime":"2026-03-20T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.015460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.015517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.015537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.015558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.015575 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.084085 4795 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.118078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.118142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.118177 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.118208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.118230 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.220848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.220928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.220952 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.220981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.221005 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.267846 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.283218 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.299896 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.315887 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.323955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.324009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.324031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.324059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.324080 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.331832 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.345052 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.427451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.427505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.427517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.427539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.427554 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.530615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.530662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.530673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.530717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.530736 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.633280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.633351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.633380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.633410 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.633433 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.736112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.736168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.736191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.736219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.736239 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.839372 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.839433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.839455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.839486 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.839514 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.881385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.881490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881542 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:19:01.881512755 +0000 UTC m=+85.339544326 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.881587 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881621 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.881641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881764 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:01.88167834 +0000 UTC m=+85.339709921 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881904 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881941 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881948 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.881965 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.882053 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:01.88202613 +0000 UTC m=+85.340057711 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.882089 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:01.882070611 +0000 UTC m=+85.340102192 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.942818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.942894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.942920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.942948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.942969 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.960935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.960996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.961018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.961045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.961068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.977997 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.982957 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.983188 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.983224 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.983244 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.983249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.983307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:57 crc kubenswrapper[4795]: E0320 17:18:57.983328 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:01.983305597 +0000 UTC m=+85.441337178 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.983332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.983378 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:57 crc kubenswrapper[4795]: I0320 17:18:57.983399 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:57Z","lastTransitionTime":"2026-03-20T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.000114 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.004344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.004404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.004431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.004460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.004484 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.020422 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.024395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.024451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.024476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.024508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.024537 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.039135 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.047227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.047318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.047338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.047367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.047385 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.063879 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.064062 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.066132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.066184 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.066202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.066229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.066247 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.169052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.169101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.169117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.169139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.169155 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.251702 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.251761 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.251792 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.251949 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.251992 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:18:58 crc kubenswrapper[4795]: E0320 17:18:58.252058 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.271730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.271778 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.271795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.271818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.271835 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.374994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.375039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.375056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.375078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.375135 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.478267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.478334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.478357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.478385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.478408 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.580494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.580575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.580593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.580619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.580637 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.683059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.683119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.683135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.683159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.683177 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.786605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.786676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.786738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.786771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.786792 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.889392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.889446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.889464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.889488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.889509 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.992850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.992912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.992962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.992989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:58 crc kubenswrapper[4795]: I0320 17:18:58.993008 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:58Z","lastTransitionTime":"2026-03-20T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.095568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.095630 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.095651 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.095677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.095754 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.198078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.198125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.198149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.198181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.198205 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.301117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.301178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.301195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.301217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.301238 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.404191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.404261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.404283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.404311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.404338 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.507503 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.507582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.507608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.507639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.507664 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.610966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.611014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.611035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.611061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.611083 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.713169 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.713209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.713231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.713256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.713276 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.815780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.815855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.815878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.815907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.815928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.918568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.918619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.918634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.918658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:59 crc kubenswrapper[4795]: I0320 17:18:59.918675 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:59Z","lastTransitionTime":"2026-03-20T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.022137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.022193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.022210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.022235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.022251 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.125711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.125773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.125791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.125818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.125841 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.228899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.228959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.228975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.228998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.229016 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.251294 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.251392 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.251297 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:00 crc kubenswrapper[4795]: E0320 17:19:00.251448 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:00 crc kubenswrapper[4795]: E0320 17:19:00.251550 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:00 crc kubenswrapper[4795]: E0320 17:19:00.251665 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.332152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.332203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.332222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.332252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.332277 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.435381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.435461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.435487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.435519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.435543 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.538131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.538178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.538195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.538219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.538237 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.640667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.640749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.640771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.640796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.640814 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.743046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.743116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.743133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.743158 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.743179 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.845902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.845973 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.845999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.846028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.846050 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.949172 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.949240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.949258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.949285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:00 crc kubenswrapper[4795]: I0320 17:19:00.949306 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:00Z","lastTransitionTime":"2026-03-20T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.052564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.052648 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.052674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.052745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.052770 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.155984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.156052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.156069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.156092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.156110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.258398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.258458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.258477 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.258501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.258520 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.361312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.361380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.361399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.361426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.361448 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.464064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.464149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.464168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.464192 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.464222 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.566531 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.566602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.566625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.566652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.566674 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.669587 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.669654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.669678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.669737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.669754 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.773087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.773135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.773152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.773175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.773192 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.876618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.876675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.876724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.876753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.876773 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.920229 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.920396 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920458 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:19:09.920416868 +0000 UTC m=+93.378448449 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.920616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.920679 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920630 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920805 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920825 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920854 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920896 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.920907 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:09.920880173 +0000 UTC m=+93.378911744 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.921008 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:09.920987956 +0000 UTC m=+93.379019537 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:01 crc kubenswrapper[4795]: E0320 17:19:01.921039 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:09.921023838 +0000 UTC m=+93.379055559 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.979611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.979665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.979681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.979738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:01 crc kubenswrapper[4795]: I0320 17:19:01.979755 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:01Z","lastTransitionTime":"2026-03-20T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.021921 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.022752 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.022811 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.022926 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.023063 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:19:10.023035126 +0000 UTC m=+93.481066697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.083074 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.083152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.083226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.083253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.083273 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.186188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.186248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.186264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.186287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.186304 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.251778 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.251813 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.251852 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.251944 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.252125 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:02 crc kubenswrapper[4795]: E0320 17:19:02.252251 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.289385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.289478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.289495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.289552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.289571 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.392119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.392170 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.392188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.392211 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.392229 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.495045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.495108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.495127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.495152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.495170 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.598382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.598643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.598811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.598963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.599097 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.702438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.702507 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.702526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.702549 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.702567 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.805442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.806412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.806597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.806812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.806997 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.910418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.910485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.910503 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.910525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:02 crc kubenswrapper[4795]: I0320 17:19:02.910542 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:02Z","lastTransitionTime":"2026-03-20T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.012858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.012908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.012925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.012947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.012965 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.115949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.116013 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.116032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.116058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.116075 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.218752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.218823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.218849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.218880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.218902 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.321673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.321776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.321794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.321818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.321835 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.424195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.424539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.424769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.424949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.425106 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.528205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.528361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.528388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.528514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.528596 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.631286 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.631629 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.631890 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.632070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.632248 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.735303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.735824 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.736006 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.736164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.736299 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.839678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.839794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.839815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.839843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.839863 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.943301 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.943369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.943388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.943463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:03 crc kubenswrapper[4795]: I0320 17:19:03.943482 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:03Z","lastTransitionTime":"2026-03-20T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.047312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.047375 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.047391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.047415 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.047433 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.150407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.150470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.150487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.150558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.150584 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.251620 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.251638 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.251986 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:04 crc kubenswrapper[4795]: E0320 17:19:04.251833 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:04 crc kubenswrapper[4795]: E0320 17:19:04.252224 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:04 crc kubenswrapper[4795]: E0320 17:19:04.252351 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.253565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.253644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.253662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.253724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.253742 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.356923 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.357009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.357029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.357052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.357102 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.459983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.460053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.460078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.460114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.460137 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.563285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.563346 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.563367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.563398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.563420 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.666420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.666472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.666485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.666502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.666513 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.769922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.769987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.770010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.770037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.770073 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.872899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.872968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.872993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.873023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.873045 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.975555 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.975635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.975657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.975746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:04 crc kubenswrapper[4795]: I0320 17:19:04.975773 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:04Z","lastTransitionTime":"2026-03-20T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.078409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.078452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.078464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.078483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.078493 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.181041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.181100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.181117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.181145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.181165 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.283141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.283208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.283217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.283231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.283242 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.386171 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.386214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.386231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.386253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.386271 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.488633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.488675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.488697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.488717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.488726 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.590908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.590967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.590988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.591011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.591027 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.693492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.693543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.693559 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.693582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.693600 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.797291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.797373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.797396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.797429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.797460 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.900570 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.900661 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.900718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.900756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:05 crc kubenswrapper[4795]: I0320 17:19:05.900780 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:05Z","lastTransitionTime":"2026-03-20T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.003793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.003851 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.003868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.003892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.003911 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.107499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.107568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.107586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.107611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.107628 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.210036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.210089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.210112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.210136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.210153 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.251113 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.251223 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.251363 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.251739 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.252333 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.252570 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.253774 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:19:06 crc kubenswrapper[4795]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 17:19:06 crc kubenswrapper[4795]: set -o allexport Mar 20 17:19:06 crc kubenswrapper[4795]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:19:06 crc kubenswrapper[4795]: source /etc/kubernetes/apiserver-url.env Mar 20 17:19:06 crc kubenswrapper[4795]: else Mar 20 17:19:06 crc kubenswrapper[4795]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:19:06 crc kubenswrapper[4795]: exit 1 Mar 20 17:19:06 crc kubenswrapper[4795]: fi Mar 20 17:19:06 crc kubenswrapper[4795]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:19:06 crc kubenswrapper[4795]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:19:06 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.255040 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.269660 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 
17:19:06.270044 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.270933 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.312777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.312830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.312855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.312886 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.312907 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.415842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.415896 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.415914 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.415936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.415952 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.518951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.519026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.519048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.519073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.519090 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.622871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.622948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.622971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.623003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.623024 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.641783 4795 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.644113 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:19:06 crc kubenswrapper[4795]: E0320 17:19:06.644364 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.725757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.725821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.725838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.725861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.725878 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.829023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.829084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.829101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.829123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.829140 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.932164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.932201 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.932209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.932224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:06 crc kubenswrapper[4795]: I0320 17:19:06.932234 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:06Z","lastTransitionTime":"2026-03-20T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.034464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.034502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.034510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.034525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.034534 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.136636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.136676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.136734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.136752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.136777 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.238887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.238941 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.238957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.238979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.238993 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.264720 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.275675 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.291681 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.306043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.321962 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.338557 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.341575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.341653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.341676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.341739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.341766 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.352076 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.443945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.444003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.444020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.444043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.444063 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.546999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.547061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.547079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.547102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.547119 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.649924 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.650000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.650023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.650051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.650076 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.752645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.752793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.752865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.752928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.752953 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.855970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.856030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.856042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.856058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.856070 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.958675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.958733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.958744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.958761 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:07 crc kubenswrapper[4795]: I0320 17:19:07.958772 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:07Z","lastTransitionTime":"2026-03-20T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.060721 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.060763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.060774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.060791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.060804 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.163604 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.163653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.163666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.163699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.163715 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.252028 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.252076 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.252208 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.252212 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.252276 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.252421 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.265841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.265893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.265911 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.265937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.265955 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.279971 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.284993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.285035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.285054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.285081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.285099 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.300217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.300270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.300287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.300313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.300330 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.319406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.319448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.319460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.319479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.319492 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.329603 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.333242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.333435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.333459 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.333489 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.333515 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.344546 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: E0320 17:19:08.344730 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.346247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.346285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.346296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.346311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.346322 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.448475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.448715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.448724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.448738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.448748 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.550521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.550559 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.550570 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.550585 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.550595 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.650641 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.650716 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.651972 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.652006 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.652018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.652037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.652078 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.666146 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.681185 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.693843 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.704895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.721014 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.730587 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.738105 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.754609 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.754648 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.754662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.754706 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.754723 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.860206 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.860240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.860250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.860265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.860275 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.962711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.962764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.962777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.962795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:08 crc kubenswrapper[4795]: I0320 17:19:08.962807 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:08Z","lastTransitionTime":"2026-03-20T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.065662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.065754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.065768 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.065788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.065828 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.168570 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.168636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.168659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.168724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.168748 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.272369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.272422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.272442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.272471 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.272492 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.375361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.375424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.375435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.375454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.375467 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.478526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.478575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.478592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.478615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.478632 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.581933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.581994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.582012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.582035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.582052 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.684502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.684548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.684565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.684587 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.684604 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.787380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.787436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.787455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.787479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.787497 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.890241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.890306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.890330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.890357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.890378 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.991657 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.991783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.991829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.991870 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992067 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:09 crc 
kubenswrapper[4795]: E0320 17:19:09.992103 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992128 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992159 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992107 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:19:25.992060021 +0000 UTC m=+109.450091622 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992557 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:19:25.992534806 +0000 UTC m=+109.450566387 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992579 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:25.992567757 +0000 UTC m=+109.450599328 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992237 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:09 crc kubenswrapper[4795]: E0320 17:19:09.992624 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:25.992613628 +0000 UTC m=+109.450645199 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.993432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.993492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.993513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.993540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:09 crc kubenswrapper[4795]: I0320 17:19:09.993564 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:09Z","lastTransitionTime":"2026-03-20T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.093069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.093274 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.093316 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.093337 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.093422 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:26.09339832 +0000 UTC m=+109.551429901 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.095706 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.095747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.095758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.095774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.095787 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.198114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.198154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.198163 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.198175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.198188 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.251369 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.251516 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.251397 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.251588 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.251807 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:10 crc kubenswrapper[4795]: E0320 17:19:10.251912 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.300566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.300634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.300659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.300718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.300741 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.402914 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.402957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.402969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.402991 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.403003 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.506053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.506096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.506107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.506123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.506132 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.573077 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-f47gv"] Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.573550 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.576544 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.577132 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.580216 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.602043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.608255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.608321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.608338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.608361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.608378 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.624092 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.643376 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.659957 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.680673 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.698195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-hosts-file\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.698322 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xsdx\" (UniqueName: \"kubernetes.io/projected/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-kube-api-access-8xsdx\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.699094 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.710485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.710561 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.710590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.710618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.710636 4795 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.713516 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.730390 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.799215 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xsdx\" (UniqueName: \"kubernetes.io/projected/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-kube-api-access-8xsdx\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.799366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-hosts-file\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.799490 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-hosts-file\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.813104 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.813135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.813145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.813160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.813169 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.822960 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xsdx\" (UniqueName: \"kubernetes.io/projected/22ee11f2-6451-4d59-8c55-ffcb0ea973a1-kube-api-access-8xsdx\") pod \"node-resolver-f47gv\" (UID: \"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\") " pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.892655 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f47gv" Mar 20 17:19:10 crc kubenswrapper[4795]: W0320 17:19:10.907138 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ee11f2_6451_4d59_8c55_ffcb0ea973a1.slice/crio-5bddc656c72b1cdc1b5c19893d8be5682d92142779d32cca871102c8a4e4af6b WatchSource:0}: Error finding container 5bddc656c72b1cdc1b5c19893d8be5682d92142779d32cca871102c8a4e4af6b: Status 404 returned error can't find the container with id 5bddc656c72b1cdc1b5c19893d8be5682d92142779d32cca871102c8a4e4af6b Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.915815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.915862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.915875 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.915893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.915905 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:10Z","lastTransitionTime":"2026-03-20T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.962519 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zb4r9"] Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.965183 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xxwb6"] Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.965523 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.965738 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-mvxvt"] Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.966147 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.966543 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xxwb6" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.969807 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.970087 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.970303 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.970873 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.970894 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.971066 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.971179 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.971184 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.971724 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.972016 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.972264 4795 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.972617 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 17:19:10 crc kubenswrapper[4795]: I0320 17:19:10.988499 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.002070 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.017209 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.020405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.020442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.020454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.020492 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.020504 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.071759 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.087260 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102570 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102633 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-system-cni-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102667 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-system-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-os-release\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102758 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-netns\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8702afd1-abd3-42d0-91e6-048802e98829-rootfs\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cni-binary-copy\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102946 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-kubelet\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-daemon-config\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.102984 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-etc-kubernetes\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103005 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-binary-copy\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103022 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cnibin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103037 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-bin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103066 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-multus-certs\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-multus\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-os-release\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103353 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvwcq\" (UniqueName: \"kubernetes.io/projected/d4f0d908-7a54-4fb3-a52d-51d088632c62-kube-api-access-pvwcq\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-conf-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103574 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8702afd1-abd3-42d0-91e6-048802e98829-proxy-tls\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103669 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xmjs\" (UniqueName: \"kubernetes.io/projected/8702afd1-abd3-42d0-91e6-048802e98829-kube-api-access-7xmjs\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-cnibin\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " 
pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103864 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103899 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-hostroot\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.103974 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-k8s-cni-cncf-io\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.104052 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxtbp\" (UniqueName: \"kubernetes.io/projected/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-kube-api-access-zxtbp\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.104038 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.104177 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-socket-dir-parent\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.104209 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8702afd1-abd3-42d0-91e6-048802e98829-mcd-auth-proxy-config\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.116595 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.122340 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.122390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.122400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.122413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.122422 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.128978 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.143292 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.155908 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.169998 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.182865 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.198288 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.204969 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-system-cni-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205058 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-system-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205190 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-system-cni-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-system-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-os-release\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-netns\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-netns\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8702afd1-abd3-42d0-91e6-048802e98829-rootfs\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-os-release\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205538 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cni-binary-copy\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205561 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8702afd1-abd3-42d0-91e6-048802e98829-rootfs\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-kubelet\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205609 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-daemon-config\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-etc-kubernetes\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-kubelet\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-binary-copy\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc 
kubenswrapper[4795]: I0320 17:19:11.205800 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-etc-kubernetes\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205827 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cnibin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-bin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cnibin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-multus-certs\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205979 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-bin\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.205987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-multus\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206011 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-multus-certs\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206023 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-os-release\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206038 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-var-lib-cni-multus\") pod 
\"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvwcq\" (UniqueName: \"kubernetes.io/projected/d4f0d908-7a54-4fb3-a52d-51d088632c62-kube-api-access-pvwcq\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206085 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-os-release\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206108 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-conf-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206142 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8702afd1-abd3-42d0-91e6-048802e98829-proxy-tls\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206175 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xmjs\" (UniqueName: \"kubernetes.io/projected/8702afd1-abd3-42d0-91e6-048802e98829-kube-api-access-7xmjs\") pod \"machine-config-daemon-mvxvt\" (UID: 
\"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-cnibin\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206253 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-hostroot\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206313 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-k8s-cni-cncf-io\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206345 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxtbp\" (UniqueName: \"kubernetes.io/projected/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-kube-api-access-zxtbp\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 
crc kubenswrapper[4795]: I0320 17:19:11.206383 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-cnibin\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206397 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-socket-dir-parent\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206417 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-conf-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206431 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8702afd1-abd3-42d0-91e6-048802e98829-mcd-auth-proxy-config\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206611 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-host-run-k8s-cni-cncf-io\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206779 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4f0d908-7a54-4fb3-a52d-51d088632c62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206802 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-cni-dir\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206825 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-hostroot\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206846 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-socket-dir-parent\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.206947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.207268 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/d4f0d908-7a54-4fb3-a52d-51d088632c62-cni-binary-copy\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.207316 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-cni-binary-copy\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.207382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-multus-daemon-config\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.208005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8702afd1-abd3-42d0-91e6-048802e98829-mcd-auth-proxy-config\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.209975 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8702afd1-abd3-42d0-91e6-048802e98829-proxy-tls\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.215760 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.226576 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.226613 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.226624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.226657 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.226669 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.228493 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxtbp\" (UniqueName: \"kubernetes.io/projected/c8c31a7c-6ccb-43e0-9c95-33b85204cc39-kube-api-access-zxtbp\") pod \"multus-xxwb6\" (UID: \"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\") " pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.234977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvwcq\" (UniqueName: \"kubernetes.io/projected/d4f0d908-7a54-4fb3-a52d-51d088632c62-kube-api-access-pvwcq\") pod \"multus-additional-cni-plugins-zb4r9\" (UID: \"d4f0d908-7a54-4fb3-a52d-51d088632c62\") " pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.236291 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.238915 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xmjs\" (UniqueName: \"kubernetes.io/projected/8702afd1-abd3-42d0-91e6-048802e98829-kube-api-access-7xmjs\") pod \"machine-config-daemon-mvxvt\" (UID: \"8702afd1-abd3-42d0-91e6-048802e98829\") " pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.253308 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.274309 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.287282 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.287915 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.298961 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.308525 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xxwb6" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.309315 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: W0320 17:19:11.322319 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8702afd1_abd3_42d0_91e6_048802e98829.slice/crio-f760fe14f2d24a12769306c8249c77b87438579c36c369aa7815e6cd40c149ec WatchSource:0}: Error finding container f760fe14f2d24a12769306c8249c77b87438579c36c369aa7815e6cd40c149ec: Status 404 returned error can't find the container with id f760fe14f2d24a12769306c8249c77b87438579c36c369aa7815e6cd40c149ec Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.327182 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.328312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.328347 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.328359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.328376 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.328389 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.347297 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: W0320 17:19:11.348907 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8c31a7c_6ccb_43e0_9c95_33b85204cc39.slice/crio-17ca4888cd83f3a23871bc3643dee2b2786931ca8fef8e312ce265cafa74cf21 
WatchSource:0}: Error finding container 17ca4888cd83f3a23871bc3643dee2b2786931ca8fef8e312ce265cafa74cf21: Status 404 returned error can't find the container with id 17ca4888cd83f3a23871bc3643dee2b2786931ca8fef8e312ce265cafa74cf21 Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.368560 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-krk7q"] Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.370401 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.372316 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.373877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.373972 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.374210 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.374397 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.374403 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.375015 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.392996 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.406915 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.408846 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.408901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.408944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 
17:19:11.409037 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409094 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409124 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409231 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 
crc kubenswrapper[4795]: I0320 17:19:11.409274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409380 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vrl5\" (UniqueName: \"kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409462 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409500 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides\") pod \"ovnkube-node-krk7q\" 
(UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409786 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.409814 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.422477 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.431564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.431606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.431619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 
17:19:11.431636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.431647 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.433932 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.454273 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.469505 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.484227 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.501967 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc 
kubenswrapper[4795]: I0320 17:19:11.510372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510403 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vrl5\" (UniqueName: \"kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510433 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510446 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510485 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510533 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.510654 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511186 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511215 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511269 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns\") 
pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511294 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511835 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511870 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" 
Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.511913 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512190 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512218 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512298 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512363 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.512546 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.514533 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.515584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.526333 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vrl5\" (UniqueName: \"kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5\") pod \"ovnkube-node-krk7q\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.530310 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.537401 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.537425 4795 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.537432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.537445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.537453 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.542440 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.554249 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.640719 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.641010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.641027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.641043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.641054 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.661884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f47gv" event={"ID":"22ee11f2-6451-4d59-8c55-ffcb0ea973a1","Type":"ContainerStarted","Data":"ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.661937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f47gv" event={"ID":"22ee11f2-6451-4d59-8c55-ffcb0ea973a1","Type":"ContainerStarted","Data":"5bddc656c72b1cdc1b5c19893d8be5682d92142779d32cca871102c8a4e4af6b"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.663557 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerStarted","Data":"e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.663586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerStarted","Data":"17ca4888cd83f3a23871bc3643dee2b2786931ca8fef8e312ce265cafa74cf21"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.666116 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerStarted","Data":"73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 
17:19:11.666154 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerStarted","Data":"ceab78a93e5ffcad27bdfeb7a60afee5acc9056757c7694e82e32c5abe81a00c"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.669654 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.669706 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.669720 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"f760fe14f2d24a12769306c8249c77b87438579c36c369aa7815e6cd40c149ec"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.678612 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.689999 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.695245 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:11 crc kubenswrapper[4795]: W0320 17:19:11.711969 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod520bb74b_cfa2_4f21_b561_989b0a3d6adc.slice/crio-1a7cad6fc70f9635016cf59ae47845a4cfbc41683f6ddf222d2b7bd36fabfbfb WatchSource:0}: Error finding container 1a7cad6fc70f9635016cf59ae47845a4cfbc41683f6ddf222d2b7bd36fabfbfb: Status 404 returned error can't find the container with id 1a7cad6fc70f9635016cf59ae47845a4cfbc41683f6ddf222d2b7bd36fabfbfb Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.715310 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.732362 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.744308 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.744506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.744640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.744667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.744887 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.749059 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.766139 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.778923 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.796923 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.813776 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.831759 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.846879 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.847407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.847450 4795 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.847470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.847494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.847511 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.858814 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.871012 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.881751 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.897447 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.908559 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.923077 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.936810 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.947425 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.950106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.950156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.950173 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.950193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.950208 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:11Z","lastTransitionTime":"2026-03-20T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.963575 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.978839 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:11 crc kubenswrapper[4795]: I0320 17:19:11.993443 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:11Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.009967 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.022826 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.052591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.052618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.052626 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc 
kubenswrapper[4795]: I0320 17:19:12.052640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.052651 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.155245 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.155289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.155303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.155325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.155338 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.251067 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:12 crc kubenswrapper[4795]: E0320 17:19:12.251292 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.251164 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:12 crc kubenswrapper[4795]: E0320 17:19:12.251518 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.251096 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:12 crc kubenswrapper[4795]: E0320 17:19:12.251760 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.258237 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.258287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.258306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.258330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.258348 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.361318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.361377 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.361395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.361421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.361438 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.465152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.465420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.465430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.465444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.465456 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.568167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.568529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.568779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.568959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.569127 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.673530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.673560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.673568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.673581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.673591 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.676495 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169" exitCode=0
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.676576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"}
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.676598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"1a7cad6fc70f9635016cf59ae47845a4cfbc41683f6ddf222d2b7bd36fabfbfb"}
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.680185 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5"}
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.684133 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca" exitCode=0
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.684204 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca"}
Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.691208 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.715156 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.734261 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.747790 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.766448 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.777159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.777203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.777213 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.777227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.777237 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.779244 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.792625 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.809663 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.831343 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.851585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.866399 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.880963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.881317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.881344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.881376 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.881399 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.882498 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.893863 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.907128 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.919577 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.934122 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.945640 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.962585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.975542 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.983832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.983860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.983869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.983882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.983891 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:12Z","lastTransitionTime":"2026-03-20T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.986991 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:12 crc kubenswrapper[4795]: I0320 17:19:12.998068 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:12Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.011099 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.023804 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.036024 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.086885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.087253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.087271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.087293 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.087312 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.189850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.189889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.189901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.189919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.189932 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.291787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.291826 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.291834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.291848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.291857 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.394352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.394390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.394402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.394418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.394428 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.496800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.496842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.496854 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.496883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.496897 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.599329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.599403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.599414 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.599426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.599435 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.689004 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158" exitCode=0 Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.689069 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698295 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698315 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698331 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.698347 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.701644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.701681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.701734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.701756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.701774 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.712667 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.741807 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.760069 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.787663 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.805139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.805182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.805198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.805217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.805233 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.807462 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.828002 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.845767 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.868724 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.887303 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.900066 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.907450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.907483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.907493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.907508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.907516 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:13Z","lastTransitionTime":"2026-03-20T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.913648 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:13 crc kubenswrapper[4795]: I0320 17:19:13.928462 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:13Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.009847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.009888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.009900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.009916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.009927 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.112407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.112452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.112463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.112475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.112483 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.215980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.216025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.216040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.216061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.216074 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.251658 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.251787 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.251665 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:14 crc kubenswrapper[4795]: E0320 17:19:14.251843 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:14 crc kubenswrapper[4795]: E0320 17:19:14.251954 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:14 crc kubenswrapper[4795]: E0320 17:19:14.252108 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.269368 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.319060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.319096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.319108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.319125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.319139 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.421536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.421577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.421588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.421603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.421615 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.524476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.524524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.524540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.524557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.524569 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.629141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.629229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.629259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.629292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.629329 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.704575 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93" exitCode=0 Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.704653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.726592 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.731544 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.731609 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.731628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 
17:19:14.731653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.731670 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.744635 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.762100 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 
17:19:14.775629 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.799242 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.825643 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.834259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.834305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.834318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.834336 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.834348 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.842315 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.857739 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.870616 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.892970 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.910440 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.924966 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938399 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:14Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:14 crc kubenswrapper[4795]: I0320 17:19:14.938665 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:14Z","lastTransitionTime":"2026-03-20T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.041518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.041812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.041966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.042116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.042247 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.144261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.145110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.145272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.145412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.145544 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.248509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.248749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.248885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.249116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.249245 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.358737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.359451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.359649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.359913 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.360106 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.463959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.464267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.464462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.464919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.465107 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.568109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.568164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.568180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.568207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.568225 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.671040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.671102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.671126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.671154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.671177 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.716386 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84" exitCode=0 Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.716534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.725047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.744895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.766478 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.773148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.773182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.773196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.773216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.773232 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.791456 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.809060 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.833100 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.856210 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.876191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.876242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.876254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.876271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.876284 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.881354 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.896563 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.911636 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.931901 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.951974 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.970534 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.978082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.978110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.978118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.978131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.978140 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:15Z","lastTransitionTime":"2026-03-20T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:15 crc kubenswrapper[4795]: I0320 17:19:15.980620 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:15Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.080093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.080122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.080132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.080145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.080155 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.182079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.182107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.182116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.182128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.182137 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.251874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.251938 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.251974 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:16 crc kubenswrapper[4795]: E0320 17:19:16.252061 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:16 crc kubenswrapper[4795]: E0320 17:19:16.252272 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:16 crc kubenswrapper[4795]: E0320 17:19:16.252430 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.291288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.291319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.291330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.291346 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.291360 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.394072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.394101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.394114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.394129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.394140 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.497095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.497133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.497149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.497176 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.497201 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.604895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.604937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.604948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.604965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.604976 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.708562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.708624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.708643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.708673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.708717 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.733278 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d" exitCode=0 Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.733324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.769260 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.786891 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.803057 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.811726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 
17:19:16.811799 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.811823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.811856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.811879 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.820192 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.863268 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.893910 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.908807 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.914628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.914665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.914677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.914713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.914726 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:16Z","lastTransitionTime":"2026-03-20T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.917466 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.931291 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.944325 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.960086 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.974607 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:16 crc kubenswrapper[4795]: I0320 17:19:16.992580 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:16Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.017249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.017309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.017326 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.017351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.017372 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.120440 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.120505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.120522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.120546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.120563 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.224021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.224074 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.224090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.224115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.224131 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.284040 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.306336 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.326853 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.328652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.329055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.329257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.329399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.329518 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.345988 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-pgsfb"] Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.346628 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.350036 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.350248 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.353423 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.353455 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.358750 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.373406 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13b60165-8101-45a9-91da-d6d1ba46a6cf-host\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.373486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5vwz\" (UniqueName: \"kubernetes.io/projected/13b60165-8101-45a9-91da-d6d1ba46a6cf-kube-api-access-t5vwz\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.373539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/13b60165-8101-45a9-91da-d6d1ba46a6cf-serviceca\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc 
kubenswrapper[4795]: I0320 17:19:17.375555 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.406615 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.427727 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.433598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.433853 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.434002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.434169 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 
17:19:17.434296 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.447340 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.467953 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.475067 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13b60165-8101-45a9-91da-d6d1ba46a6cf-host\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.475144 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5vwz\" (UniqueName: \"kubernetes.io/projected/13b60165-8101-45a9-91da-d6d1ba46a6cf-kube-api-access-t5vwz\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.475213 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/13b60165-8101-45a9-91da-d6d1ba46a6cf-serviceca\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.475223 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/13b60165-8101-45a9-91da-d6d1ba46a6cf-host\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.477328 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/13b60165-8101-45a9-91da-d6d1ba46a6cf-serviceca\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.486487 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.506625 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5vwz\" (UniqueName: \"kubernetes.io/projected/13b60165-8101-45a9-91da-d6d1ba46a6cf-kube-api-access-t5vwz\") pod \"node-ca-pgsfb\" (UID: \"13b60165-8101-45a9-91da-d6d1ba46a6cf\") " pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.507201 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93
a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",
\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.528618 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.537082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.537164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.537186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.537214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.537236 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.545642 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.570633 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e
6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.585649 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.603302 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.627299 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.639405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.639469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.639494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.639525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.639549 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.641922 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.656073 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.669115 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pgsfb" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.670624 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d
81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.683108 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: W0320 17:19:17.685852 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13b60165_8101_45a9_91da_d6d1ba46a6cf.slice/crio-f11afad5b1d5a04a287ab7a29596a5019718060a614e9bafab9682fc1ef06b37 WatchSource:0}: Error finding container f11afad5b1d5a04a287ab7a29596a5019718060a614e9bafab9682fc1ef06b37: Status 404 returned error can't find the container with id 
f11afad5b1d5a04a287ab7a29596a5019718060a614e9bafab9682fc1ef06b37 Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.707011 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.729528 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.741149 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744089 4795 generic.go:334] "Generic (PLEG): container finished" podID="d4f0d908-7a54-4fb3-a52d-51d088632c62" containerID="401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea" exitCode=0 Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744994 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.744196 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerDied","Data":"401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.748077 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pgsfb" event={"ID":"13b60165-8101-45a9-91da-d6d1ba46a6cf","Type":"ContainerStarted","Data":"f11afad5b1d5a04a287ab7a29596a5019718060a614e9bafab9682fc1ef06b37"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.759363 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.772382 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.786078 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.798920 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.813215 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.828089 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.848277 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.848315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.848325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.848343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.848353 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.855962 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.871297 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.890042 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.910705 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e
6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.923304 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.933164 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.949543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.949575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.949586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.949604 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.949616 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:17Z","lastTransitionTime":"2026-03-20T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.953043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.965992 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.978827 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:17 crc kubenswrapper[4795]: I0320 17:19:17.991951 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.005350 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.052252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.052319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.052343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.052372 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.052394 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.156088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.156145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.156164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.156193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.156214 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.251138 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.251153 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.251154 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.251260 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.251391 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.251613 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.259765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.259849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.259868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.259929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.259949 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.363073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.363109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.363118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.363132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.363142 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.466533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.466573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.466583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.466597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.466607 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.569208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.569261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.569280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.569303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.569322 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.588368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.588421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.588437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.588457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.588471 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.607584 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.611356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.611403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.611421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.611446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.611465 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.625828 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.629309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.629365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.629394 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.629424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.629447 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.643859 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z"
Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.647039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.647076 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.647087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.647103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.647114 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.658803 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z"
Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.661951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.661983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.661994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.662009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.662021 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.681997 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: E0320 17:19:18.682229 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.688166 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.688243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.688264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.688292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.688322 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.762380 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.764346 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.764418 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.764554 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.773621 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" event={"ID":"d4f0d908-7a54-4fb3-a52d-51d088632c62","Type":"ContainerStarted","Data":"20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.776434 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pgsfb" event={"ID":"13b60165-8101-45a9-91da-d6d1ba46a6cf","Type":"ContainerStarted","Data":"5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.785023 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.793337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.793408 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.793430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.793463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.793485 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.804963 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.804940 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.808138 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.823163 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.839247 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.858924 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.888933 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.896556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.896618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.896642 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.896672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.896727 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.909257 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.941880 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.961007 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e
6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.972075 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.981949 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:18Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.998299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.998333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.998345 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.998360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:18 crc kubenswrapper[4795]: I0320 17:19:18.998372 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:18Z","lastTransitionTime":"2026-03-20T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.009164 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.025149 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.036763 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.052382 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.064540 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.092627 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.101131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.101191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.101210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.101235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.101255 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.111999 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.128035 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.147413 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.160926 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: 
I0320 17:19:19.173463 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.204418 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.205084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.205119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.205132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.205149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.205162 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.224432 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.241862 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.263212 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.283178 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.302586 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.312994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.313195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.313360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.313494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.313621 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.416470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.416532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.416547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.416567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.416580 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.518852 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.518902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.518917 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.518938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.518954 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.624349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.624412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.624454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.624483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.624501 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.727733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.728270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.728383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.728523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.728734 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.782148 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c"}
Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.804358 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.828347 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.833082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.833158 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.833181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.833217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.833241 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.850482 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.882148 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.906513 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.930404 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.935297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.935353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.935371 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.935393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.935407 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:19Z","lastTransitionTime":"2026-03-20T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.953940 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z 
is after 2025-08-24T17:21:41Z" Mar 20 17:19:19 crc kubenswrapper[4795]: I0320 17:19:19.981123 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:19Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.004265 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.018426 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718
b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.029895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.037983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.038017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.038027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc 
kubenswrapper[4795]: I0320 17:19:20.038042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.038051 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.047517 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.059661 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.071921 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.140620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.140659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.140669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.140707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.140723 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.243986 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.244055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.244078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.244108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.244128 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.251253 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:20 crc kubenswrapper[4795]: E0320 17:19:20.251398 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.251480 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:20 crc kubenswrapper[4795]: E0320 17:19:20.251562 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.251640 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:20 crc kubenswrapper[4795]: E0320 17:19:20.251743 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.347244 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.347309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.347334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.347363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.347383 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.450598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.450993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.451020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.451038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.451055 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.554178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.554231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.554242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.554261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.554273 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.657800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.657847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.657859 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.657879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.657891 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.760094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.760126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.760133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.760147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.760157 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.808286 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/0.log" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.811463 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6" exitCode=1 Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.811519 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.812942 4795 scope.go:117] "RemoveContainer" containerID="4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.839588 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.856736 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.862271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.862306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.862321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.862343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.862358 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.874051 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.890578 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.901316 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: 
I0320 17:19:20.920616 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:20Z\\\",\\\"message\\\":\\\"t/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:19:20.568148 6528 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:20.568453 6528 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:20.568541 6528 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 17:19:20.568575 6528 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 17:19:20.568620 6528 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:20.568659 6528 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 17:19:20.568734 6528 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 17:19:20.568808 6528 factory.go:656] Stopping watch factory\\\\nI0320 17:19:20.568847 6528 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:20.568897 6528 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:19:20.568901 6528 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 17:19:20.569007 6528 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 17:19:20.568911 6528 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:20.568932 6528 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.934237 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.947176 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.966101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.966137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.966153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.966176 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.966194 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:20Z","lastTransitionTime":"2026-03-20T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.970226 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:20 crc kubenswrapper[4795]: I0320 17:19:20.988959 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.004851 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.034150 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.045946 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.058038 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.069241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.069283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.069295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.069312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.069325 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.171796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.171835 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.171847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.171863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.171875 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.252609 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:19:21 crc kubenswrapper[4795]: E0320 17:19:21.252877 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.274366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.274401 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.274413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.274428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.274440 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.376413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.376461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.376473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.376489 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.376501 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.478999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.479054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.479069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.479087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.479099 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.581560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.581599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.581608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.581624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.581640 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.684234 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.684282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.684290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.684305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.684317 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.786991 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.787274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.787285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.787302 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.787314 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.818789 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/0.log" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.822108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.823017 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.840989 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.856210 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.869469 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: 
I0320 17:19:21.889564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.889594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.889602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.889615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.889625 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.899056 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:20Z\\\",\\\"message\\\":\\\"t/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:19:20.568148 6528 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:20.568453 6528 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 
17:19:20.568541 6528 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 17:19:20.568575 6528 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 17:19:20.568620 6528 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:20.568659 6528 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 17:19:20.568734 6528 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 17:19:20.568808 6528 factory.go:656] Stopping watch factory\\\\nI0320 17:19:20.568847 6528 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:20.568897 6528 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:19:20.568901 6528 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 17:19:20.569007 6528 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 17:19:20.568911 6528 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:20.568932 6528 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.923372 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357
eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.939183 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.953825 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.973287 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.986860 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:21Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.992427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.992487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.992512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.992538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:21 crc kubenswrapper[4795]: I0320 17:19:21.992556 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:21Z","lastTransitionTime":"2026-03-20T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.023291 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.039437 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.056940 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.068874 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.088085 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.095587 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.095640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.095652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 
17:19:22.095670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.095702 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.198277 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.198344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.198366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.198393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.198409 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.251823 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.251869 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.251868 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:22 crc kubenswrapper[4795]: E0320 17:19:22.251992 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:22 crc kubenswrapper[4795]: E0320 17:19:22.252163 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:22 crc kubenswrapper[4795]: E0320 17:19:22.252358 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.302247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.302329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.302347 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.302370 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.302391 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.405318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.405406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.405464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.405501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.405525 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.508390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.508468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.508498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.508524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.508542 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.611274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.611337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.611358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.611385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.611402 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.714457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.714520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.714541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.714569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.714588 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.817804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.817859 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.817877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.817904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.817920 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.828118 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/1.log" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.829025 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/0.log" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.832862 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302" exitCode=1 Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.832911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.832956 4795 scope.go:117] "RemoveContainer" containerID="4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.833960 4795 scope.go:117] "RemoveContainer" containerID="14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302" Mar 20 17:19:22 crc kubenswrapper[4795]: E0320 17:19:22.834209 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.859638 4795 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.880503 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.899806 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.915842 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.920625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.920716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.920742 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.920773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.920796 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:22Z","lastTransitionTime":"2026-03-20T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.940005 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.961117 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:22 crc kubenswrapper[4795]: I0320 17:19:22.979898 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.002341 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:22Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: 
I0320 17:19:23.024827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.024899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.024925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.024957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.024975 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.029379 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:20Z\\\",\\\"message\\\":\\\"t/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:19:20.568148 6528 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:20.568453 6528 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 
17:19:20.568541 6528 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 17:19:20.568575 6528 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 17:19:20.568620 6528 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:20.568659 6528 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 17:19:20.568734 6528 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 17:19:20.568808 6528 factory.go:656] Stopping watch factory\\\\nI0320 17:19:20.568847 6528 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:20.568897 6528 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:19:20.568901 6528 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 17:19:20.569007 6528 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 17:19:20.568911 6528 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:20.568932 6528 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 
handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.065101 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.088863 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.108415 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.131329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.131458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.131486 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.131522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.131562 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.135282 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.153484 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.234661 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.234764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.234781 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.234809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.234829 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.334261 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp"] Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.335034 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.338483 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.338716 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.339325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.339385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.339406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.339435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.339457 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.351996 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.385553 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40f
b2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.407509 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.426860 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.443077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.443136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.443154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.443178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.443202 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.447851 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.452124 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-env-overrides\") pod \"ovnkube-control-plane-749d76644c-727fp\" 
(UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.452239 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtbgk\" (UniqueName: \"kubernetes.io/projected/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-kube-api-access-jtbgk\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.452346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.452473 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.466975 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.489428 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.508865 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.529043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.545258 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.546428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.546485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.546507 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.546539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.546560 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.553005 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-env-overrides\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.553059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtbgk\" (UniqueName: \"kubernetes.io/projected/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-kube-api-access-jtbgk\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.553120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.553199 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.554136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-env-overrides\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.554447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.568897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.579926 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4270e94a4d15ee802945d9b65aa1f9c10aaf15afd75c4b031896a8f162a147a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:20Z\\\",\\\"message\\\":\\\"t/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 17:19:20.568148 6528 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:20.568453 6528 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 
17:19:20.568541 6528 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 17:19:20.568575 6528 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 17:19:20.568620 6528 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:20.568659 6528 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 17:19:20.568734 6528 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 17:19:20.568808 6528 factory.go:656] Stopping watch factory\\\\nI0320 17:19:20.568847 6528 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:20.568897 6528 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 17:19:20.568901 6528 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 17:19:20.569007 6528 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 17:19:20.568911 6528 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:20.568932 6528 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 
handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.585757 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtbgk\" (UniqueName: 
\"kubernetes.io/projected/3bb75ab9-7e4b-411f-bebe-cf4e2016b031-kube-api-access-jtbgk\") pod \"ovnkube-control-plane-749d76644c-727fp\" (UID: \"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.604579 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.629188 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.648672 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.649121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.649186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.649212 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.649243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.649270 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.659100 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.663300 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-
rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: W0320 17:19:23.673994 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb75ab9_7e4b_411f_bebe_cf4e2016b031.slice/crio-f9796b73deaca7d582a6518de0261f733679cd4c13913a195fda22a726993579 WatchSource:0}: Error finding container f9796b73deaca7d582a6518de0261f733679cd4c13913a195fda22a726993579: Status 404 returned error can't find the container with id f9796b73deaca7d582a6518de0261f733679cd4c13913a195fda22a726993579 Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.756049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.756102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.756119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.756144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.756166 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.837049 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" event={"ID":"3bb75ab9-7e4b-411f-bebe-cf4e2016b031","Type":"ContainerStarted","Data":"f9796b73deaca7d582a6518de0261f733679cd4c13913a195fda22a726993579"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.839318 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/1.log" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.843484 4795 scope.go:117] "RemoveContainer" containerID="14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302" Mar 20 17:19:23 crc kubenswrapper[4795]: E0320 17:19:23.843774 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.859717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.859766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.859782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.859801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.859816 4795 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.873330 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.892726 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357
eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.910497 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.928884 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.946711 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: 
I0320 17:19:23.961354 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.966213 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.966271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.966287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.966304 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.966321 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:23Z","lastTransitionTime":"2026-03-20T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:23 crc kubenswrapper[4795]: I0320 17:19:23.989099 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:23Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.004014 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.016989 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.032593 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.046229 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.067436 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.069314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.069365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.069383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.069406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.069422 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.081382 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.083138 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jpp4c"] Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.083639 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.083735 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.096630 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.112427 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.126252 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.135755 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.146726 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc 
kubenswrapper[4795]: I0320 17:19:24.163530 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.163587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmxqx\" (UniqueName: \"kubernetes.io/projected/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-kube-api-access-tmxqx\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.166563 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.171281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.171326 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.171338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.171357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.171368 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.178009 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.191224 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.202796 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: 
I0320 17:19:24.220219 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.241825 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.251890 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.251951 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.252283 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.251974 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.252525 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.252306 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.258425 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.264295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.264411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmxqx\" (UniqueName: \"kubernetes.io/projected/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-kube-api-access-tmxqx\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.264734 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.264814 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:24.764793771 +0000 UTC m=+108.222825422 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.273299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.273337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.273349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.273365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.273375 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.279740 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.295125 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.295307 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmxqx\" (UniqueName: \"kubernetes.io/projected/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-kube-api-access-tmxqx\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.308219 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.320654 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
7:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.335750 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.349806 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.375856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.376117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.376271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.376341 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.376410 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.479279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.479341 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.479359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.479386 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.479404 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.582385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.582606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.582740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.582839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.582919 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.686027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.686097 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.686122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.686154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.686178 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.770291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.770547 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:24 crc kubenswrapper[4795]: E0320 17:19:24.770633 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:25.770612736 +0000 UTC m=+109.228644317 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.793188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.793235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.793253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.793277 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.793294 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.847935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" event={"ID":"3bb75ab9-7e4b-411f-bebe-cf4e2016b031","Type":"ContainerStarted","Data":"1fd620d94f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.848648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" event={"ID":"3bb75ab9-7e4b-411f-bebe-cf4e2016b031","Type":"ContainerStarted","Data":"1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.861650 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.874130 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.883625 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.895349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.895405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.895427 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.895457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.895483 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.896632 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.908440 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.921073 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.953268 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.970762 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.993230 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:24Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.998189 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.998250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.998267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.998291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:24 crc kubenswrapper[4795]: I0320 17:19:24.998308 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:24Z","lastTransitionTime":"2026-03-20T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.009623 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.024029 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.040559 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.054264 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.078086 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.095035 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.100453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.100605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.100723 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.100833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.100934 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.111621 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:25Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.203901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.204278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.204297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.204322 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.204340 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.306907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.306962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.306979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.307002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.307019 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.410363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.410419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.410436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.410463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.410482 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.513060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.513107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.513124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.513146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.513164 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.615090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.615168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.615193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.615218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.615235 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.717843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.717922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.717946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.717976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.717993 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.781295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:25 crc kubenswrapper[4795]: E0320 17:19:25.781568 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:25 crc kubenswrapper[4795]: E0320 17:19:25.781653 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:27.781623707 +0000 UTC m=+111.239655288 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.821242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.821306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.821331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.821362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.821388 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.924972 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.925039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.925058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.925083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:25 crc kubenswrapper[4795]: I0320 17:19:25.925101 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:25Z","lastTransitionTime":"2026-03-20T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.030474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.030542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.030563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.030591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.030613 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.083985 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084194 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:19:58.084153717 +0000 UTC m=+141.542185298 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.084277 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.084362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.084429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084435 4795 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084641 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084675 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084734 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084529 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084733 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:58.084707024 +0000 UTC m=+141.542738605 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084883 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:58.084847317 +0000 UTC m=+141.542878898 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.084912 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:58.084897859 +0000 UTC m=+141.542929440 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.133486 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.133549 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.133566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.133590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.133608 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.185513 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.185804 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.185833 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.185852 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.185933 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:58.185909907 +0000 UTC m=+141.643941489 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.236951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.237010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.237027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.237052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.237070 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.251713 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.251791 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.251893 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.252019 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.252060 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.252113 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.252233 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:26 crc kubenswrapper[4795]: E0320 17:19:26.252311 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.340229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.340285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.340298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.340316 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.340328 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.443507 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.443571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.443594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.443624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.443648 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.546315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.546380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.546398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.546423 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.546443 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.649369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.649428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.649445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.649468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.649486 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.753501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.753565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.753583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.753608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.753627 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.855795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.857057 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.857075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.857097 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.857114 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.960275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.960341 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.960358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.960382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:26 crc kubenswrapper[4795]: I0320 17:19:26.960399 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:26Z","lastTransitionTime":"2026-03-20T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.063903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.063967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.063984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.064008 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.064027 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.167441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.167509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.167526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.167549 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.167567 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.270502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.270567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.270591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.270621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.270644 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.273174 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.290383 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.307162 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc 
kubenswrapper[4795]: I0320 17:19:27.329889 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.349918 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373086 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373105 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373155 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.373742 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3
bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.392157 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.425139 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.456936 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.476491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.476557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.476577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.476606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.476626 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.477242 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.498034 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.520130 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.535754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.553939 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
7:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.575107 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.579420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc 
kubenswrapper[4795]: I0320 17:19:27.579473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.579490 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.579515 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.579531 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.593154 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:27Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.682713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.682783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.682800 4795 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.682824 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.682842 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.786740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.786800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.786816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.786841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.786859 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.804438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:27 crc kubenswrapper[4795]: E0320 17:19:27.804627 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:27 crc kubenswrapper[4795]: E0320 17:19:27.804760 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:31.804730903 +0000 UTC m=+115.262762474 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.889788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.889866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.889885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.889910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.889928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.992987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.993037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.993054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.993074 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:27 crc kubenswrapper[4795]: I0320 17:19:27.993086 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:27Z","lastTransitionTime":"2026-03-20T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.096578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.096717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.096747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.096775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.096798 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.200189 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.200268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.200287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.200313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.200331 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.251972 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.252037 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.252068 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.252065 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.252163 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.252284 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.252368 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.252461 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.303522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.303617 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.303635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.303715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.303735 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.408411 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.408486 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.408508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.408532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.408552 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.511528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.511620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.511640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.511664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.511707 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.615755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.619521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.619678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.619759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.619811 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.703717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.703775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.703787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.703805 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.703820 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.723539 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.728650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.728713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.728724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.728742 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.728756 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.743720 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.748460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.748535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.748557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.748585 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.748604 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.767800 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.772495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.772545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.772596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.772621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.772728 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.796145 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.801121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.801210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.801231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.801256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.801302 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.823497 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:28Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:28 crc kubenswrapper[4795]: E0320 17:19:28.823673 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.825306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.825370 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.825389 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.825414 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.825469 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.929281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.929392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.929420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.929450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:28 crc kubenswrapper[4795]: I0320 17:19:28.929472 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:28Z","lastTransitionTime":"2026-03-20T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.032245 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.032320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.032332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.032352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.032363 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.135585 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.135635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.135654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.135680 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.135724 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.238974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.239025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.239044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.239071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.239088 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.341905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.341969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.341988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.342012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.342031 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.445753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.445816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.445834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.445858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.445875 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.549320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.549417 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.549437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.549461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.549509 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.652634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.652704 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.652728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.652753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.652764 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.755717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.755780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.755799 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.755825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.755843 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.859032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.859354 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.859487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.859627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.859797 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.962438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.962506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.962523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.962550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:29 crc kubenswrapper[4795]: I0320 17:19:29.962568 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:29Z","lastTransitionTime":"2026-03-20T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.065656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.065740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.065759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.065789 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.065811 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.168926 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.168985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.169003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.169026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.169044 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.251124 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.251186 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.251201 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.251122 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:30 crc kubenswrapper[4795]: E0320 17:19:30.251341 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:30 crc kubenswrapper[4795]: E0320 17:19:30.251426 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:30 crc kubenswrapper[4795]: E0320 17:19:30.251597 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:30 crc kubenswrapper[4795]: E0320 17:19:30.251748 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.271740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.271790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.271810 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.271836 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.271857 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.374893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.374953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.374970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.374995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.375012 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.478145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.478204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.478222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.478251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.478270 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.581366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.581449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.581472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.581500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.581526 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.684822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.684892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.684909 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.684934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.684951 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.788007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.788073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.788095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.788125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.788147 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.891390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.891453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.891469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.891494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.891512 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.995162 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.995229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.995246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.995271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:30 crc kubenswrapper[4795]: I0320 17:19:30.995291 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:30Z","lastTransitionTime":"2026-03-20T17:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.097679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.097757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.097773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.097800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.097817 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.200390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.200449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.200474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.200503 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.200525 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.303841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.303905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.303938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.304009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.304037 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.407071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.407150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.407169 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.407195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.407213 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.510140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.510201 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.510222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.510246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.510262 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.613285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.613335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.613355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.613381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.613424 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.716249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.716311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.716330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.716353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.716371 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.819636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.819755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.819774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.819798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.819815 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.855639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:31 crc kubenswrapper[4795]: E0320 17:19:31.856292 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:31 crc kubenswrapper[4795]: E0320 17:19:31.856535 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:39.856487598 +0000 UTC m=+123.314519169 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.922722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.923010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.923112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.923212 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:31 crc kubenswrapper[4795]: I0320 17:19:31.923296 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:31Z","lastTransitionTime":"2026-03-20T17:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.025873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.026155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.026236 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.026332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.026425 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.128872 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.128937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.128955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.128978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.128997 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.231624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.231999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.232135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.232249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.232342 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.251533 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.251533 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.251558 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.251593 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:32 crc kubenswrapper[4795]: E0320 17:19:32.251996 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:32 crc kubenswrapper[4795]: E0320 17:19:32.252060 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:32 crc kubenswrapper[4795]: E0320 17:19:32.252113 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:32 crc kubenswrapper[4795]: E0320 17:19:32.252175 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.335327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.335369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.335409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.335431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.335447 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.438328 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.438395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.438437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.438473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.438494 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.541532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.541630 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.541655 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.541724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.541784 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.644541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.644589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.644598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.644613 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.644624 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.748064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.748149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.748167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.748192 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.748209 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.851491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.851637 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.851660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.851751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.851769 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.954542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.954626 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.954649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.954712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:32 crc kubenswrapper[4795]: I0320 17:19:32.954739 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:32Z","lastTransitionTime":"2026-03-20T17:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.058000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.058060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.058078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.058101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.058120 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.160925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.161003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.161026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.161051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.161068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.263910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.263966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.263983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.264004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.264022 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.366873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.366946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.366971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.367000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.367021 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.486279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.486355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.486381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.486409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.486430 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.589300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.589364 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.589381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.589406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.589425 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.692927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.692992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.693010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.693035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.693052 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.797528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.797602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.797625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.797660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.797725 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.900584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.900646 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.900663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.900718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:33 crc kubenswrapper[4795]: I0320 17:19:33.900738 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:33Z","lastTransitionTime":"2026-03-20T17:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.003657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.003785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.003812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.003837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.003854 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.107993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.108049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.108066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.108090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.108108 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.211057 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.211433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.211579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.211760 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.211888 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.252055 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.252127 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.252058 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:34 crc kubenswrapper[4795]: E0320 17:19:34.252254 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:34 crc kubenswrapper[4795]: E0320 17:19:34.252339 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:34 crc kubenswrapper[4795]: E0320 17:19:34.252494 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.252081 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:34 crc kubenswrapper[4795]: E0320 17:19:34.253002 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.314844 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.314927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.314950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.314985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.315012 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.417717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.417782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.417800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.417828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.417845 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.520672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.520769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.520787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.520811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.520829 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.623515 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.623578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.623596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.623621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.623639 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.727136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.727218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.727241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.727272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.727292 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.830730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.831582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.831755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.831889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.832057 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.935122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.935433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.935581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.935787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:34 crc kubenswrapper[4795]: I0320 17:19:34.935935 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:34Z","lastTransitionTime":"2026-03-20T17:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.039446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.039818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.039978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.040116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.040243 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.142766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.142824 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.142842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.142868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.142887 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.245381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.245421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.245432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.245448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.245460 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.251756 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.347826 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.347876 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.347893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.347915 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.347932 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.455955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.456024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.456044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.456071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.456089 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.560649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.560776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.560799 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.560830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.560849 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.663865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.663898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.663907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.663920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.663930 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.767178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.767267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.767295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.767332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.767359 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.870610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.870673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.870734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.870773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.870803 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.896419 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.900869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.901540 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.933824 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.959003 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.974848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 
17:19:35.974912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.974934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.974964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.974982 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:35Z","lastTransitionTime":"2026-03-20T17:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:35 crc kubenswrapper[4795]: I0320 17:19:35.983082 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.003236 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:35Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.035599 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.053973 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.073406 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.077560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.077624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.077639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.077659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.077676 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.093068 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z 
is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.108509 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.129866 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.148118 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.172250 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.180898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.180965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.180984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.181012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.181029 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.190754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.223488 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.246217 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.251241 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.251310 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.251440 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.251557 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:36 crc kubenswrapper[4795]: E0320 17:19:36.251551 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:36 crc kubenswrapper[4795]: E0320 17:19:36.251740 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:36 crc kubenswrapper[4795]: E0320 17:19:36.251776 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:36 crc kubenswrapper[4795]: E0320 17:19:36.252003 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.269847 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:36Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.284479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.284560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.284578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.284610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.284633 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.388186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.388562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.388579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.388606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.388623 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.491140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.491200 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.491217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.491240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.491259 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.593851 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.593910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.593929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.593954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.593972 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.696758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.696831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.696853 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.696878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.696898 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.799873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.799953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.799971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.799995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.800013 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.902830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.902884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.902902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.902925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:36 crc kubenswrapper[4795]: I0320 17:19:36.902942 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:36Z","lastTransitionTime":"2026-03-20T17:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.006449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.006498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.006514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.006538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.006554 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:37Z","lastTransitionTime":"2026-03-20T17:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.109150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.109187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.109198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.109214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.109226 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:37Z","lastTransitionTime":"2026-03-20T17:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:37 crc kubenswrapper[4795]: E0320 17:19:37.210400 4795 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.252731 4795 scope.go:117] "RemoveContainer" containerID="14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.269360 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\
\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.295945 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b81
9eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.316055 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.336656 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.352641 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: E0320 17:19:37.376489 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.376351 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.396411 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.421698 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.440824 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: 
I0320 17:19:37.472651 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.492984 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.531172 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.556957 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.574980 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.595470 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.613939 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.911828 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/1.log" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.919123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212"} Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.919723 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.942479 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.957745 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.972960 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:37 crc kubenswrapper[4795]: I0320 17:19:37.986401 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:37Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.002174 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.013721 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: 
I0320 17:19:38.042372 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 
17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.057405 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc 
kubenswrapper[4795]: I0320 17:19:38.085813 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.102346 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.129042 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.149167 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.162095 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.175025 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
7:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.195169 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.207260 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.251725 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.251788 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.251817 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.251795 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.251950 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.252039 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.252100 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.252280 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.926151 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/2.log" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.927118 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/1.log" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.933916 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" exitCode=1 Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.933980 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212"} Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.934029 4795 scope.go:117] "RemoveContainer" containerID="14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.934718 4795 scope.go:117] "RemoveContainer" 
containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.934934 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.962370 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.967848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.967954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.967975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.967996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.968010 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:38Z","lastTransitionTime":"2026-03-20T17:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.986131 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: E0320 17:19:38.986287 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.991449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.991491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.991505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.991522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:38 crc kubenswrapper[4795]: I0320 17:19:38.991538 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:38Z","lastTransitionTime":"2026-03-20T17:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.010865 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.015983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.016030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.016049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.016073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.016091 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:39Z","lastTransitionTime":"2026-03-20T17:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.016252 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.033605 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.038774 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.045409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.045513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.045589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:39 crc 
kubenswrapper[4795]: I0320 17:19:39.045711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.045742 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:39Z","lastTransitionTime":"2026-03-20T17:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.065236 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.069320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.069397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.069416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.069435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.069448 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:39Z","lastTransitionTime":"2026-03-20T17:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.074522 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14e24867d7d994dbb949bdc01ee46c29066970b150367f4412c721a3055a4302\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:21Z\\\",\\\"message\\\":\\\"11] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849244 6742 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 17:19:21.849401 6742 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 17:19:21.850776 6742 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 17:19:21.850804 6742 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 17:19:21.850832 6742 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 17:19:21.850838 6742 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 17:19:21.850855 6742 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 17:19:21.850876 6742 factory.go:656] Stopping watch factory\\\\nI0320 17:19:21.850888 6742 ovnkube.go:599] Stopped ovnkube\\\\nI0320 17:19:21.850916 6742 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 17:19:21.850925 6742 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 17:19:21.850923 6742 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 17:19:21.850934 6742 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 17:19:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 
services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cn
i-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",
\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc 
kubenswrapper[4795]: E0320 17:19:39.088218 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.089295 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.089382 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.119954 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178ee
c4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.142397 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.161514 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.180774 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.196791 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.216176 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
7:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.237362 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.255253 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.275555 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.291011 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.940976 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/2.log" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.947349 4795 scope.go:117] "RemoveContainer" containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.947715 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.955630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.955865 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:39 crc kubenswrapper[4795]: E0320 17:19:39.955992 4795 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:19:55.955957232 +0000 UTC m=+139.413988803 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.969406 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:39 crc kubenswrapper[4795]: I0320 17:19:39.987087 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:39Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.021308 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.040510 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.066083 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.088149 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.110987 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.131335 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: 
I0320 17:19:40.147309 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.180568 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.197246 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.215295 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.228386 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.246342 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.251139 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.251172 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.251175 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:40 crc kubenswrapper[4795]: E0320 17:19:40.251322 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.251356 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:40 crc kubenswrapper[4795]: E0320 17:19:40.251456 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:40 crc kubenswrapper[4795]: E0320 17:19:40.251571 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:40 crc kubenswrapper[4795]: E0320 17:19:40.251659 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.266085 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:40 crc kubenswrapper[4795]: I0320 17:19:40.283405 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:40Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:42 crc kubenswrapper[4795]: I0320 17:19:42.252062 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:42 crc kubenswrapper[4795]: I0320 17:19:42.252114 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:42 crc kubenswrapper[4795]: E0320 17:19:42.252487 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:42 crc kubenswrapper[4795]: I0320 17:19:42.252138 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:42 crc kubenswrapper[4795]: I0320 17:19:42.252119 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:42 crc kubenswrapper[4795]: E0320 17:19:42.252571 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:42 crc kubenswrapper[4795]: E0320 17:19:42.252648 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:42 crc kubenswrapper[4795]: E0320 17:19:42.252767 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:42 crc kubenswrapper[4795]: I0320 17:19:42.262672 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 17:19:42 crc kubenswrapper[4795]: E0320 17:19:42.378016 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:19:44 crc kubenswrapper[4795]: I0320 17:19:44.252017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:44 crc kubenswrapper[4795]: I0320 17:19:44.252047 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:44 crc kubenswrapper[4795]: I0320 17:19:44.252075 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:44 crc kubenswrapper[4795]: E0320 17:19:44.252198 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:44 crc kubenswrapper[4795]: I0320 17:19:44.252233 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:44 crc kubenswrapper[4795]: E0320 17:19:44.252453 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:44 crc kubenswrapper[4795]: E0320 17:19:44.252592 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:44 crc kubenswrapper[4795]: E0320 17:19:44.252748 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:46 crc kubenswrapper[4795]: I0320 17:19:46.252430 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:46 crc kubenswrapper[4795]: I0320 17:19:46.253888 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:46 crc kubenswrapper[4795]: I0320 17:19:46.253908 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:46 crc kubenswrapper[4795]: I0320 17:19:46.254055 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:46 crc kubenswrapper[4795]: E0320 17:19:46.261008 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:46 crc kubenswrapper[4795]: E0320 17:19:46.261840 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:46 crc kubenswrapper[4795]: E0320 17:19:46.262025 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:46 crc kubenswrapper[4795]: E0320 17:19:46.261789 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.271102 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.293900 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.317855 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: 
I0320 17:19:47.348157 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.365076 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: E0320 17:19:47.378756 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.386429 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.402411 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.419664 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.436246 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.451228 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.465806 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.498010 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.517531 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.535537 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.553576 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.572724 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:47 crc kubenswrapper[4795]: I0320 17:19:47.588339 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:47Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:48 crc kubenswrapper[4795]: I0320 17:19:48.251735 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:48 crc kubenswrapper[4795]: I0320 17:19:48.251772 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:48 crc kubenswrapper[4795]: I0320 17:19:48.251809 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:48 crc kubenswrapper[4795]: I0320 17:19:48.251742 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:48 crc kubenswrapper[4795]: E0320 17:19:48.251939 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:48 crc kubenswrapper[4795]: E0320 17:19:48.252057 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:48 crc kubenswrapper[4795]: E0320 17:19:48.252224 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:48 crc kubenswrapper[4795]: E0320 17:19:48.252285 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.428119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.428185 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.428209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.428238 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.428257 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:49Z","lastTransitionTime":"2026-03-20T17:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:49 crc kubenswrapper[4795]: E0320 17:19:49.449050 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:49Z is after 2025-08-24T17:21:41Z"
Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.455117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.455181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.455204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.455235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.455256 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:49Z","lastTransitionTime":"2026-03-20T17:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:49 crc kubenswrapper[4795]: E0320 17:19:49.478493 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:49Z is after 2025-08-24T17:21:41Z"
Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.483564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.483632 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.483658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.483714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.483738 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:49Z","lastTransitionTime":"2026-03-20T17:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:49 crc kubenswrapper[4795]: E0320 17:19:49.503955 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:49Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.508965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.509030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.509054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.509085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.509108 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:49Z","lastTransitionTime":"2026-03-20T17:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:49 crc kubenswrapper[4795]: E0320 17:19:49.528938 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:49Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.533877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.533938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.533960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.533988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:49 crc kubenswrapper[4795]: I0320 17:19:49.534010 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:49Z","lastTransitionTime":"2026-03-20T17:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:49 crc kubenswrapper[4795]: E0320 17:19:49.554202 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:49Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:49 crc kubenswrapper[4795]: E0320 17:19:49.554914 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:19:50 crc kubenswrapper[4795]: I0320 17:19:50.251973 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:50 crc kubenswrapper[4795]: I0320 17:19:50.252010 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:50 crc kubenswrapper[4795]: I0320 17:19:50.252136 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:50 crc kubenswrapper[4795]: E0320 17:19:50.252292 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:50 crc kubenswrapper[4795]: I0320 17:19:50.252327 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:50 crc kubenswrapper[4795]: E0320 17:19:50.252492 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:50 crc kubenswrapper[4795]: E0320 17:19:50.252637 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:50 crc kubenswrapper[4795]: E0320 17:19:50.252873 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:52 crc kubenswrapper[4795]: I0320 17:19:52.251673 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:52 crc kubenswrapper[4795]: I0320 17:19:52.251794 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:52 crc kubenswrapper[4795]: I0320 17:19:52.251907 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:52 crc kubenswrapper[4795]: E0320 17:19:52.252088 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:52 crc kubenswrapper[4795]: I0320 17:19:52.252126 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:52 crc kubenswrapper[4795]: E0320 17:19:52.252266 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:52 crc kubenswrapper[4795]: E0320 17:19:52.252398 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:52 crc kubenswrapper[4795]: E0320 17:19:52.252563 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:52 crc kubenswrapper[4795]: E0320 17:19:52.380636 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:19:52 crc kubenswrapper[4795]: I0320 17:19:52.979105 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.003437 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:52Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.019874 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.039551 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.062326 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.080222 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: 
I0320 17:19:53.117018 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.134537 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.156857 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 
17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd402
88c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.175926 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.196344 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.214396 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.233175 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.248912 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.282511 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.304647 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.322587 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94
f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:53 crc kubenswrapper[4795]: I0320 17:19:53.340834 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:53Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:54 crc kubenswrapper[4795]: I0320 17:19:54.251300 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:54 crc kubenswrapper[4795]: I0320 17:19:54.251338 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:54 crc kubenswrapper[4795]: I0320 17:19:54.251373 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:54 crc kubenswrapper[4795]: E0320 17:19:54.251514 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:54 crc kubenswrapper[4795]: I0320 17:19:54.251564 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:54 crc kubenswrapper[4795]: E0320 17:19:54.251813 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:54 crc kubenswrapper[4795]: E0320 17:19:54.251940 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:54 crc kubenswrapper[4795]: E0320 17:19:54.252122 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:54 crc kubenswrapper[4795]: I0320 17:19:54.253167 4795 scope.go:117] "RemoveContainer" containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" Mar 20 17:19:54 crc kubenswrapper[4795]: E0320 17:19:54.253433 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:19:55 crc kubenswrapper[4795]: I0320 17:19:55.458897 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 17:19:56 crc kubenswrapper[4795]: I0320 17:19:56.042497 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.042738 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.043147 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:20:28.043119656 +0000 UTC m=+171.501151227 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:19:56 crc kubenswrapper[4795]: I0320 17:19:56.251738 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:56 crc kubenswrapper[4795]: I0320 17:19:56.251809 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:56 crc kubenswrapper[4795]: I0320 17:19:56.251738 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.251947 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.252126 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.252285 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:56 crc kubenswrapper[4795]: I0320 17:19:56.252549 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:56 crc kubenswrapper[4795]: E0320 17:19:56.252834 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.268779 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.290570 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.310289 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.334478 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.353470 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: 
E0320 17:19:57.381484 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.389237 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.410647 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.426967 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 
17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd402
88c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.439132 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.452989 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.470417 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.489043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.501904 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.532965 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.551610 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.569775 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.587312 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:57 crc kubenswrapper[4795]: I0320 17:19:57.608034 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:57Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.018565 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/0.log" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.018626 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8c31a7c-6ccb-43e0-9c95-33b85204cc39" containerID="e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d" exitCode=1 Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.018672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerDied","Data":"e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d"} Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.019115 4795 scope.go:117] "RemoveContainer" containerID="e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.038732 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.055083 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.068366 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.090507 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.102414 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.136159 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 
17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd402
88c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.154900 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.164831 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.164965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165029 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:02.165001083 +0000 UTC m=+205.623032634 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165111 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165132 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165148 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165215 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:21:02.165192918 +0000 UTC m=+205.623224549 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.165146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.165348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165236 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165498 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:21:02.165458266 +0000 UTC m=+205.623489857 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165500 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.165605 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:21:02.16558398 +0000 UTC m=+205.623615621 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.177888 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.193607 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.210450 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.219574 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.236776 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.250501 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.251674 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.251755 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.251798 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.251885 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.251984 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.252205 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.252382 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.252471 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.261096 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.266619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.266778 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.266794 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.266805 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:58 crc kubenswrapper[4795]: E0320 17:19:58.266846 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:21:02.266834196 +0000 UTC m=+205.724865737 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.269894 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.284154 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.300359 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.318420 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:58 crc kubenswrapper[4795]: I0320 17:19:58.335909 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf3755202f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:58Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.025245 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/0.log" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.025378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerStarted","Data":"c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86"} Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.046271 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.061003 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.077132 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.097520 4795 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.116625 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.141299 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.158956 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: 
I0320 17:19:59.190917 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.207893 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.229262 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 
17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd402
88c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.251876 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.272099 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.291833 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.311948 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.325634 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.346999 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.360319 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.369376 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.379367 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.881850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.881916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.881933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.881957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.881973 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:59Z","lastTransitionTime":"2026-03-20T17:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:59 crc kubenswrapper[4795]: E0320 17:19:59.904091 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.909031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.909119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.909141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.909160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.909205 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:59Z","lastTransitionTime":"2026-03-20T17:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:59 crc kubenswrapper[4795]: E0320 17:19:59.933464 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.938487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.938564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.938581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.938608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.938629 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:59Z","lastTransitionTime":"2026-03-20T17:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:59 crc kubenswrapper[4795]: E0320 17:19:59.958568 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.963745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.963976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.964142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.964292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.964447 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:59Z","lastTransitionTime":"2026-03-20T17:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:19:59 crc kubenswrapper[4795]: E0320 17:19:59.986027 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:59Z is after 2025-08-24T17:21:41Z" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.990710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.990881 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.991016 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.991162 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:19:59 crc kubenswrapper[4795]: I0320 17:19:59.991474 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:19:59Z","lastTransitionTime":"2026-03-20T17:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.011579 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:00Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.011948 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:20:00 crc kubenswrapper[4795]: I0320 17:20:00.251401 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.251789 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:00 crc kubenswrapper[4795]: I0320 17:20:00.251448 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.252050 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:00 crc kubenswrapper[4795]: I0320 17:20:00.251402 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.252283 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:00 crc kubenswrapper[4795]: I0320 17:20:00.251495 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:00 crc kubenswrapper[4795]: E0320 17:20:00.252531 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:02 crc kubenswrapper[4795]: I0320 17:20:02.251171 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:02 crc kubenswrapper[4795]: I0320 17:20:02.251280 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:02 crc kubenswrapper[4795]: I0320 17:20:02.251204 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:02 crc kubenswrapper[4795]: I0320 17:20:02.251204 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:02 crc kubenswrapper[4795]: E0320 17:20:02.251375 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:02 crc kubenswrapper[4795]: E0320 17:20:02.251482 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:02 crc kubenswrapper[4795]: E0320 17:20:02.251604 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:02 crc kubenswrapper[4795]: E0320 17:20:02.251811 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:02 crc kubenswrapper[4795]: E0320 17:20:02.382448 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:04 crc kubenswrapper[4795]: I0320 17:20:04.251971 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:04 crc kubenswrapper[4795]: I0320 17:20:04.252020 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:04 crc kubenswrapper[4795]: I0320 17:20:04.252056 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:04 crc kubenswrapper[4795]: I0320 17:20:04.251993 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:04 crc kubenswrapper[4795]: E0320 17:20:04.252217 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:04 crc kubenswrapper[4795]: E0320 17:20:04.252344 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:04 crc kubenswrapper[4795]: E0320 17:20:04.252497 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:04 crc kubenswrapper[4795]: E0320 17:20:04.252599 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:05 crc kubenswrapper[4795]: I0320 17:20:05.252315 4795 scope.go:117] "RemoveContainer" containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.053235 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/2.log" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.056614 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.057296 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.070895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.084003 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.095352 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.110616 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718
b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.122189 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22
a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.141313 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod 
openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube
-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubec
fg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.154826 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc 
kubenswrapper[4795]: I0320 17:20:06.174730 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05
ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.189593 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.202669 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.215367 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.230023 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.242201 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.251532 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.251576 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:06 crc kubenswrapper[4795]: E0320 17:20:06.251736 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.251828 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:06 crc kubenswrapper[4795]: E0320 17:20:06.251972 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.252013 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:06 crc kubenswrapper[4795]: E0320 17:20:06.252106 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:06 crc kubenswrapper[4795]: E0320 17:20:06.252226 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.277576 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.294920 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.315005 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.332642 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.355238 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:06 crc kubenswrapper[4795]: I0320 17:20:06.374098 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:06Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.069671 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/3.log" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.071101 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/2.log" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.075389 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" exitCode=1 Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.075439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.075487 4795 scope.go:117] "RemoveContainer" containerID="9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.076481 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:20:07 crc kubenswrapper[4795]: E0320 17:20:07.076839 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.091213 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.109241 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.121699 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.135343 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc 
kubenswrapper[4795]: I0320 17:20:07.150830 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05
ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.164293 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.179571 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.196599 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.210876 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: 
I0320 17:20:07.234134 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:20:06Z\\\",\\\"message\\\":\\\"uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173240 7299 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 17:20:06.173280 7299 model_client.go:382] Update operations generated as: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173377 7299 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nF0320 17:20:06.173391 7299 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde1916
9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.263834 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.281096 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.294532 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.308264 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.321571 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.334338 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.347401 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
7:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.361865 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.375299 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: E0320 17:20:07.382825 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.387106 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.399655 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.408747 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.422924 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.448241 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a56515366d02a85c25d74ff75760117100cf890033d825202abc90ed3909212\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:38Z\\\",\\\"message\\\":\\\"ormer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:19:38Z is after 2025-08-24T17:21:41Z]\\\\nI0320 17:19:38.365471 7023 services_controller.go:443] Built service openshift-marketplace/redhat-marketplace LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:50051, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0320 17:19:38.366014 7023 event.go:377] Event(v1.ObjectReference{Kind:\\\\\\\"Pod\\\\\\\", Namespace:\\\\\\\"openshift-multus\\\\\\\", Name:\\\\\\\"network-metrics-daemon-jpp4c\\\\\\\", UID:\\\\\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\\\\\", APIVersion:\\\\\\\"v1\\\\\\\", ResourceVersion:\\\\\\\"26951\\\\\\\", FieldPath:\\\\\\\"\\\\\\\"}): type: 'Warning' reason: 'ErrorAddingResource' addLogicalPort failed for openshift-multus/network-metrics-daemon-jpp4c: failed to update pod openshift-multus/network-m\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:20:06Z\\\",\\\"message\\\":\\\"uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173240 7299 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 17:20:06.173280 7299 model_client.go:382] Update operations generated as: 
[{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173377 7299 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nF0320 17:20:06.173391 7299 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde1916
9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.460475 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.472882 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.482824 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.492355 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.505577 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.516639 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.527847 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.554562 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.567749 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.584248 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.594873 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.608493 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.631602 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:07 crc kubenswrapper[4795]: I0320 17:20:07.646621 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:07Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.078989 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/3.log" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.082755 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:20:08 crc kubenswrapper[4795]: 
E0320 17:20:08.082937 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.095856 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0da
f0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.111336 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.127496 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.140404 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: 
I0320 17:20:08.169087 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:20:06Z\\\",\\\"message\\\":\\\"uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173240 7299 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 17:20:06.173280 7299 model_client.go:382] Update operations 
generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173377 7299 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nF0320 17:20:06.173391 7299 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.181525 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.195860 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 
17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd402
88c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.208856 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.227589 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.245815 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.251655 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.251752 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.251809 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:08 crc kubenswrapper[4795]: E0320 17:20:08.251982 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.252140 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:08 crc kubenswrapper[4795]: E0320 17:20:08.252411 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:08 crc kubenswrapper[4795]: E0320 17:20:08.252539 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:08 crc kubenswrapper[4795]: E0320 17:20:08.252615 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.262908 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.276526 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.307205 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17
:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.327895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.342356 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.357381 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.376423 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.393608 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:08 crc kubenswrapper[4795]: I0320 17:20:08.409917 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:08Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.054181 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.054243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.054260 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.054290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.054307 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:10Z","lastTransitionTime":"2026-03-20T17:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.078186 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.084180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.084239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.084255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.084278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.084297 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:10Z","lastTransitionTime":"2026-03-20T17:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.099930 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.104667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.104780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.104808 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.104838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.104875 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:10Z","lastTransitionTime":"2026-03-20T17:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.126569 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.130458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.130524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.130543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.130569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.130588 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:10Z","lastTransitionTime":"2026-03-20T17:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.147104 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.151613 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.151759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.151870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.151964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.152049 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:10Z","lastTransitionTime":"2026-03-20T17:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.167165 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:10Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.167409 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.252049 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.252071 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.252178 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:10 crc kubenswrapper[4795]: I0320 17:20:10.252188 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.252354 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.252573 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.252651 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:10 crc kubenswrapper[4795]: E0320 17:20:10.252799 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:12 crc kubenswrapper[4795]: I0320 17:20:12.251620 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:12 crc kubenswrapper[4795]: I0320 17:20:12.251620 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:12 crc kubenswrapper[4795]: I0320 17:20:12.251815 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:12 crc kubenswrapper[4795]: E0320 17:20:12.251855 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:12 crc kubenswrapper[4795]: I0320 17:20:12.251781 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:12 crc kubenswrapper[4795]: E0320 17:20:12.252009 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:12 crc kubenswrapper[4795]: E0320 17:20:12.252029 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:12 crc kubenswrapper[4795]: E0320 17:20:12.252160 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:12 crc kubenswrapper[4795]: E0320 17:20:12.383981 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:14 crc kubenswrapper[4795]: I0320 17:20:14.251216 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:14 crc kubenswrapper[4795]: I0320 17:20:14.251323 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:14 crc kubenswrapper[4795]: I0320 17:20:14.251408 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:14 crc kubenswrapper[4795]: E0320 17:20:14.251499 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:14 crc kubenswrapper[4795]: I0320 17:20:14.251729 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:14 crc kubenswrapper[4795]: E0320 17:20:14.251717 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:14 crc kubenswrapper[4795]: E0320 17:20:14.251906 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:14 crc kubenswrapper[4795]: E0320 17:20:14.252125 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:16 crc kubenswrapper[4795]: I0320 17:20:16.251145 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:16 crc kubenswrapper[4795]: I0320 17:20:16.251303 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:16 crc kubenswrapper[4795]: I0320 17:20:16.252263 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:16 crc kubenswrapper[4795]: I0320 17:20:16.252304 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:16 crc kubenswrapper[4795]: E0320 17:20:16.252458 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:16 crc kubenswrapper[4795]: E0320 17:20:16.252604 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:16 crc kubenswrapper[4795]: E0320 17:20:16.252856 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:16 crc kubenswrapper[4795]: E0320 17:20:16.253031 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.271917 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8ec578e2f42c1f8129d84e7f2b6a7da5c6ab7b7fb914570caeb23c2a7f047d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.293982 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xxwb6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c31a7c-6ccb-43e0-9c95-33b85204cc39\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:19:57Z\\\",\\\"message\\\":\\\"2026-03-20T17:19:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2\\\\n2026-03-20T17:19:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9ee55963-c513-477d-92ae-45db62cd33f2 to /host/opt/cni/bin/\\\\n2026-03-20T17:19:12Z [verbose] multus-daemon started\\\\n2026-03-20T17:19:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T17:19:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxtbp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xxwb6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.312672 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb75ab9-7e4b-411f-bebe-cf4e2016b031\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dd89d38d0f647f0824776b33d5fb69ed80f06428cfc993b43aece2d8821ba85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fd620d94f6600b03107b98bdf30b1cf37552
02f5fab266499f2048a6760bec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtbgk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-727fp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.329187 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048942bb-32ac-441f-8850-5ddbefab0fcd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27138860cccc1d68d72f2be283cc4c433cba143c8ea78e3941520eeedf8cc65b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35c7829998912c443753c9ddcfc0eee8b3bd465d00675f16c1d82975b48d410d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.350903 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.365370 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f47gv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22ee11f2-6451-4d59-8c55-ffcb0ea973a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef71c80a3b92711836d0dc4c98dd2fd0e39815db7d56b9036a191bbfd7aac28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8xsdx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f47gv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: E0320 17:20:17.384522 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.388592 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22f4248a-e081-4e6f-9dde-6641fb13d92f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 17:18:49.858066 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 17:18:49.858334 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:18:49.859441 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3021802780/tls.crt::/tmp/serving-cert-3021802780/tls.key\\\\\\\"\\\\nI0320 17:18:50.453820 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 17:18:50.455305 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 17:18:50.455322 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 17:18:50.455340 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0320 17:18:50.455344 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 17:18:50.459261 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 17:18:50.459270 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 17:18:50.459294 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459303 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 17:18:50.459314 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 17:18:50.459319 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 17:18:50.459324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 17:18:50.459329 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 17:18:50.463054 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:18:49Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.406264 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20803043-e156-4742-b372-0dbe96db3809\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ace54ce6d6afb5b4f37973db32c9a7f9acd943398911c9b91dde619c7bed1ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://caa56ce9a79a59907cf29ccc0daf0d6729248521c900fbd76fc8b8abe02cbe76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 17:17:39.721569 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 17:17:39.747774 1 observer_polling.go:159] Starting file observer\\\\nI0320 17:17:40.083456 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 17:17:40.123077 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 17:18:03.452615 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 17:18:03.452790 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c898e7d0419e934cf20bf2d5ddf74dd9316431af3255ccf60ae40b6c8899e5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b189293786f7bde1afdd4946afd4edf20f6eeae27b3389dea40587d7c561d654\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.423057 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.439485 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4f0d908-7a54-4fb3-a52d-51d088632c62\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20190bd604ad2827cef6657aa25230f86009c1f91264c20ea375b30961e6bf1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73ed65e91dd882d1231b31f17319079ff4bd41a38c62828d7ff892b35ea22eca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c099f7771313a2e5f5b2eb7d64e349efff17ca6d94bf973c314c1b3f32df158\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://895980d2f267223d701da1e62872094cd859e6a6e80944dbd11304dac117eb93\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df39b
7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df39b7be70ede5c5c4acdf47e4ac1985ab25b7260620f8899b905a492910de84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3101e115f6b29a14b8dcc26384020718b9c52d2f670182abd9cb88356ad17d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://401bd6bed2f18ad354ff36a216054c9119f67d4734aa4dc09e421d3e5c4605ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pvwcq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zb4r9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.451986 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8702afd1-abd3-42d0-91e6-048802e98829\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://919babb586c420c3c3a902ee86b219a0456f4dfe85a4e1dd25012ecf962d905e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7xmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mvxvt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: 
I0320 17:20:17.471944 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:13Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T17:20:06Z\\\",\\\"message\\\":\\\"uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173240 7299 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 17:20:06.173280 7299 model_client.go:382] Update operations 
generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-apiserver-operator/metrics]} name:Service_openshift-kube-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.109:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {1f62a432-33b9-495d-83b2-d1dbe6961325}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 17:20:06.173377 7299 services_controller.go:452] Built service openshift-console/downloads per-node LB for network=default: []services.LB{}\\\\nF0320 17:20:06.173391 7299 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4ea823b2752e1908b
d37a1460195c6a198aafa3121c3bcba32eae91bde19169\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4vrl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-krk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.484581 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmxqx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jpp4c\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.511793 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bda5cfbb-a167-4ccc-b684-008006c1eece\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aab571bec4dc614de47c830bccf099bb39e7875550afbebafffce846dd54be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be33e1f92218be72bcbafdf5c2fd6e8f38fc04d50726c1566e89ed44bcc0e5b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c5c16da9fd8140e2121b6e03288856bc9ed33712de868b25839691c08da3895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
88d4bad9a9419d4126dfa3cf380c178eec4dea0f0271ac3e4479eef2817a28ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b5ec09b193b47301a7c0be9f3533112ba3808befac4e6d40c54c2054ff6e9a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69323d79dafb7570ebde1362b2bc683e8b8aac8951d61512b2fe25f00dd84317\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6091fdf2885642a88b96a59635c80db11c272fc6548108e5e64fdf73b2b7c7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b40fb2dc1945e105016286512aeb37f3e3ee2e5990aafb4c3bfd6ab064a086d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.527275 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97591979-12db-4b48-aab3-7215af398852\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37850b52ac9cd1a009d4b09529a28e6ecd0f4ce917b1d75ea801e6cca36d1ee6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c48ff74e456a808173a95b6148ff24386d319f5d6215ea0d187864b2693832b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://697ba54372b625ba70d018a01e93610fa98da99d1ae3db83c0f1fe020132f0f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://0be3e6374940088f37f38327f410016493cbfcc93dbee432c680a7e50b0fd91d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.546181 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fe796ea17b31fcb5fa8c3878e4be002ca602fa0828ee6cbb6b82ab42772da7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.561317 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.579952 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92e7b2b11f803b496590fa76b8318ade554e3ce41f49e536ec127989a671144c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a89a1191e22514c45913fe92ad5b1d8684591d81925c5314add5304b1564203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:17 crc kubenswrapper[4795]: I0320 17:20:17.592275 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pgsfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b60165-8101-45a9-91da-d6d1ba46a6cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e12076b20254135dfffca7b5e9c08269789b12b608c26c1f737874ced9b490f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:19:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5vwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:19:17Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pgsfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:17Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:18 crc kubenswrapper[4795]: I0320 17:20:18.251455 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:18 crc kubenswrapper[4795]: I0320 17:20:18.251467 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:18 crc kubenswrapper[4795]: I0320 17:20:18.251621 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:18 crc kubenswrapper[4795]: I0320 17:20:18.251805 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:18 crc kubenswrapper[4795]: E0320 17:20:18.251798 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:18 crc kubenswrapper[4795]: E0320 17:20:18.251974 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:18 crc kubenswrapper[4795]: E0320 17:20:18.252353 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:18 crc kubenswrapper[4795]: E0320 17:20:18.252628 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:19 crc kubenswrapper[4795]: I0320 17:20:19.253047 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:20:19 crc kubenswrapper[4795]: E0320 17:20:19.253323 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.251375 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.251423 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.251452 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.251508 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.251555 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.251836 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.251952 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.252125 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.354179 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.354246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.354265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.354289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.354309 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:20Z","lastTransitionTime":"2026-03-20T17:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.374207 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.378804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.378855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.378873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.378898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.378915 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:20Z","lastTransitionTime":"2026-03-20T17:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.398363 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.403188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.403251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.403275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.403307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.403332 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:20Z","lastTransitionTime":"2026-03-20T17:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.423555 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.428607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.428729 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.428754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.428786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.428808 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:20Z","lastTransitionTime":"2026-03-20T17:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.450819 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.456011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.456122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.456142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.456209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:20 crc kubenswrapper[4795]: I0320 17:20:20.456228 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:20Z","lastTransitionTime":"2026-03-20T17:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.477125 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:20:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"55dab564-f3ba-4083-bf1a-aa261eb80746\\\",\\\"systemUUID\\\":\\\"14ef5e9e-707f-4ad8-89b5-1abff10c4fa0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:20:20Z is after 2025-08-24T17:21:41Z" Mar 20 17:20:20 crc kubenswrapper[4795]: E0320 17:20:20.477307 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:20:22 crc kubenswrapper[4795]: I0320 17:20:22.252038 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:22 crc kubenswrapper[4795]: I0320 17:20:22.252058 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:22 crc kubenswrapper[4795]: I0320 17:20:22.252299 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:22 crc kubenswrapper[4795]: E0320 17:20:22.252333 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:22 crc kubenswrapper[4795]: I0320 17:20:22.252339 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:22 crc kubenswrapper[4795]: E0320 17:20:22.252824 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:22 crc kubenswrapper[4795]: E0320 17:20:22.253133 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:22 crc kubenswrapper[4795]: E0320 17:20:22.253245 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:22 crc kubenswrapper[4795]: E0320 17:20:22.385742 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:24 crc kubenswrapper[4795]: I0320 17:20:24.251540 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:24 crc kubenswrapper[4795]: I0320 17:20:24.251656 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:24 crc kubenswrapper[4795]: E0320 17:20:24.253801 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:24 crc kubenswrapper[4795]: I0320 17:20:24.251865 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:24 crc kubenswrapper[4795]: E0320 17:20:24.253949 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:24 crc kubenswrapper[4795]: I0320 17:20:24.251656 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:24 crc kubenswrapper[4795]: E0320 17:20:24.254090 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:24 crc kubenswrapper[4795]: E0320 17:20:24.254194 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:26 crc kubenswrapper[4795]: I0320 17:20:26.252002 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:26 crc kubenswrapper[4795]: E0320 17:20:26.252186 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:26 crc kubenswrapper[4795]: I0320 17:20:26.252457 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:26 crc kubenswrapper[4795]: E0320 17:20:26.252556 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:26 crc kubenswrapper[4795]: I0320 17:20:26.252820 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:26 crc kubenswrapper[4795]: I0320 17:20:26.252853 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:26 crc kubenswrapper[4795]: E0320 17:20:26.252950 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:26 crc kubenswrapper[4795]: E0320 17:20:26.253089 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.302263 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-727fp" podStartSLOduration=114.30223068 podStartE2EDuration="1m54.30223068s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.279648801 +0000 UTC m=+170.737680432" watchObservedRunningTime="2026-03-20 17:20:27.30223068 +0000 UTC m=+170.760262251" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.343567 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xxwb6" podStartSLOduration=114.343534349 podStartE2EDuration="1m54.343534349s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.325523681 +0000 UTC m=+170.783555262" watchObservedRunningTime="2026-03-20 17:20:27.343534349 +0000 UTC m=+170.801565920" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.360552 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f47gv" podStartSLOduration=114.360523276 podStartE2EDuration="1m54.360523276s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.344233991 +0000 UTC m=+170.802265562" watchObservedRunningTime="2026-03-20 17:20:27.360523276 +0000 UTC m=+170.818554857" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.361099 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.361089233 podStartE2EDuration="29.361089233s" podCreationTimestamp="2026-03-20 17:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.360362791 +0000 UTC m=+170.818394372" watchObservedRunningTime="2026-03-20 17:20:27.361089233 +0000 UTC m=+170.819120804" Mar 20 17:20:27 crc kubenswrapper[4795]: E0320 17:20:27.386631 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.469401 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podStartSLOduration=114.469375587 podStartE2EDuration="1m54.469375587s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.468554081 +0000 UTC m=+170.926585642" watchObservedRunningTime="2026-03-20 17:20:27.469375587 +0000 UTC m=+170.927407178" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.469761 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zb4r9" podStartSLOduration=114.469752378 podStartE2EDuration="1m54.469752378s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.447336954 +0000 UTC m=+170.905368535" watchObservedRunningTime="2026-03-20 17:20:27.469752378 +0000 UTC m=+170.927783959" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 
17:20:27.535361 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.535343 podStartE2EDuration="1m21.535343s" podCreationTimestamp="2026-03-20 17:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.534797023 +0000 UTC m=+170.992828604" watchObservedRunningTime="2026-03-20 17:20:27.535343 +0000 UTC m=+170.993374551" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.552194 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=32.552173031 podStartE2EDuration="32.552173031s" podCreationTimestamp="2026-03-20 17:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.550806469 +0000 UTC m=+171.008838030" watchObservedRunningTime="2026-03-20 17:20:27.552173031 +0000 UTC m=+171.010204612" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.643285 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pgsfb" podStartSLOduration=114.643264212 podStartE2EDuration="1m54.643264212s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.642506638 +0000 UTC m=+171.100538219" watchObservedRunningTime="2026-03-20 17:20:27.643264212 +0000 UTC m=+171.101295753" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.669889 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=73.669872226 podStartE2EDuration="1m13.669872226s" podCreationTimestamp="2026-03-20 17:19:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.668340899 +0000 UTC m=+171.126372450" watchObservedRunningTime="2026-03-20 17:20:27.669872226 +0000 UTC m=+171.127903767" Mar 20 17:20:27 crc kubenswrapper[4795]: I0320 17:20:27.682717 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=45.682668803 podStartE2EDuration="45.682668803s" podCreationTimestamp="2026-03-20 17:19:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:27.682034593 +0000 UTC m=+171.140066144" watchObservedRunningTime="2026-03-20 17:20:27.682668803 +0000 UTC m=+171.140700354" Mar 20 17:20:28 crc kubenswrapper[4795]: I0320 17:20:28.129879 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.130143 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.130256 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs podName:996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77 nodeName:}" failed. No retries permitted until 2026-03-20 17:21:32.130228264 +0000 UTC m=+235.588259835 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs") pod "network-metrics-daemon-jpp4c" (UID: "996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:20:28 crc kubenswrapper[4795]: I0320 17:20:28.251653 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:28 crc kubenswrapper[4795]: I0320 17:20:28.251669 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:28 crc kubenswrapper[4795]: I0320 17:20:28.251785 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.251969 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:28 crc kubenswrapper[4795]: I0320 17:20:28.252046 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.252147 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.252233 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:28 crc kubenswrapper[4795]: E0320 17:20:28.252743 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.252123 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.252161 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.252202 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.252226 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:30 crc kubenswrapper[4795]: E0320 17:20:30.252340 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:30 crc kubenswrapper[4795]: E0320 17:20:30.252438 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:30 crc kubenswrapper[4795]: E0320 17:20:30.252615 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:30 crc kubenswrapper[4795]: E0320 17:20:30.252820 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.688770 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.688839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.688862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.688890 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.688912 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:20:30Z","lastTransitionTime":"2026-03-20T17:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.757351 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm"] Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.757901 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.761657 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.761922 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.762078 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.766739 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.853964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e6c553b-299b-4aaf-945a-81fc44d50569-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.854024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.854049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9e6c553b-299b-4aaf-945a-81fc44d50569-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.854068 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e6c553b-299b-4aaf-945a-81fc44d50569-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.854102 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.955405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e6c553b-299b-4aaf-945a-81fc44d50569-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.955467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e6c553b-299b-4aaf-945a-81fc44d50569-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.955547 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.955763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.956075 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e6c553b-299b-4aaf-945a-81fc44d50569-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.956167 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.956259 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/9e6c553b-299b-4aaf-945a-81fc44d50569-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.957243 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9e6c553b-299b-4aaf-945a-81fc44d50569-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.966677 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e6c553b-299b-4aaf-945a-81fc44d50569-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:30 crc kubenswrapper[4795]: I0320 17:20:30.987020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e6c553b-299b-4aaf-945a-81fc44d50569-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5gtvm\" (UID: \"9e6c553b-299b-4aaf-945a-81fc44d50569\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:31 crc kubenswrapper[4795]: I0320 17:20:31.081106 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" Mar 20 17:20:31 crc kubenswrapper[4795]: W0320 17:20:31.106370 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e6c553b_299b_4aaf_945a_81fc44d50569.slice/crio-77cdc77d93a20ba0405bb54b910d2e2cd6164b5e8132dc3f53f6249798ad2453 WatchSource:0}: Error finding container 77cdc77d93a20ba0405bb54b910d2e2cd6164b5e8132dc3f53f6249798ad2453: Status 404 returned error can't find the container with id 77cdc77d93a20ba0405bb54b910d2e2cd6164b5e8132dc3f53f6249798ad2453 Mar 20 17:20:31 crc kubenswrapper[4795]: I0320 17:20:31.168984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" event={"ID":"9e6c553b-299b-4aaf-945a-81fc44d50569","Type":"ContainerStarted","Data":"77cdc77d93a20ba0405bb54b910d2e2cd6164b5e8132dc3f53f6249798ad2453"} Mar 20 17:20:31 crc kubenswrapper[4795]: I0320 17:20:31.284822 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 17:20:31 crc kubenswrapper[4795]: I0320 17:20:31.292800 4795 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.174045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" event={"ID":"9e6c553b-299b-4aaf-945a-81fc44d50569","Type":"ContainerStarted","Data":"b80a75e01e186a29a1d36b898a85a77f2379e025a1bd6957f193a1fe3ca00dc8"} Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.191616 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5gtvm" podStartSLOduration=119.191591175 podStartE2EDuration="1m59.191591175s" podCreationTimestamp="2026-03-20 17:18:33 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:32.190863123 +0000 UTC m=+175.648894704" watchObservedRunningTime="2026-03-20 17:20:32.191591175 +0000 UTC m=+175.649622746" Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.251883 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.251980 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.252008 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:32 crc kubenswrapper[4795]: E0320 17:20:32.252062 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:32 crc kubenswrapper[4795]: I0320 17:20:32.251883 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:32 crc kubenswrapper[4795]: E0320 17:20:32.252176 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:32 crc kubenswrapper[4795]: E0320 17:20:32.252283 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:32 crc kubenswrapper[4795]: E0320 17:20:32.252435 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:32 crc kubenswrapper[4795]: E0320 17:20:32.388444 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 17:20:33 crc kubenswrapper[4795]: I0320 17:20:33.253134 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:20:33 crc kubenswrapper[4795]: E0320 17:20:33.253471 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:20:34 crc kubenswrapper[4795]: I0320 17:20:34.251831 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:34 crc kubenswrapper[4795]: I0320 17:20:34.251903 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:34 crc kubenswrapper[4795]: I0320 17:20:34.251837 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:34 crc kubenswrapper[4795]: E0320 17:20:34.251997 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:34 crc kubenswrapper[4795]: I0320 17:20:34.252105 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:34 crc kubenswrapper[4795]: E0320 17:20:34.252149 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:34 crc kubenswrapper[4795]: E0320 17:20:34.252249 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:34 crc kubenswrapper[4795]: E0320 17:20:34.252477 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:36 crc kubenswrapper[4795]: I0320 17:20:36.251861 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:36 crc kubenswrapper[4795]: I0320 17:20:36.251950 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:36 crc kubenswrapper[4795]: I0320 17:20:36.251957 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:36 crc kubenswrapper[4795]: I0320 17:20:36.252083 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:36 crc kubenswrapper[4795]: E0320 17:20:36.252077 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:36 crc kubenswrapper[4795]: E0320 17:20:36.252258 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:36 crc kubenswrapper[4795]: E0320 17:20:36.252463 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:36 crc kubenswrapper[4795]: E0320 17:20:36.252523 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:37 crc kubenswrapper[4795]: E0320 17:20:37.389067 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:38 crc kubenswrapper[4795]: I0320 17:20:38.251041 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:38 crc kubenswrapper[4795]: I0320 17:20:38.251084 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:38 crc kubenswrapper[4795]: I0320 17:20:38.251130 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:38 crc kubenswrapper[4795]: E0320 17:20:38.251183 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:38 crc kubenswrapper[4795]: I0320 17:20:38.251197 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:38 crc kubenswrapper[4795]: E0320 17:20:38.251269 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:38 crc kubenswrapper[4795]: E0320 17:20:38.251405 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:38 crc kubenswrapper[4795]: E0320 17:20:38.251501 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:40 crc kubenswrapper[4795]: I0320 17:20:40.251469 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:40 crc kubenswrapper[4795]: I0320 17:20:40.251487 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:40 crc kubenswrapper[4795]: E0320 17:20:40.252039 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:40 crc kubenswrapper[4795]: I0320 17:20:40.251575 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:40 crc kubenswrapper[4795]: I0320 17:20:40.251536 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:40 crc kubenswrapper[4795]: E0320 17:20:40.252218 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:40 crc kubenswrapper[4795]: E0320 17:20:40.252340 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:40 crc kubenswrapper[4795]: E0320 17:20:40.252461 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:42 crc kubenswrapper[4795]: I0320 17:20:42.251851 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:42 crc kubenswrapper[4795]: I0320 17:20:42.251858 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:42 crc kubenswrapper[4795]: E0320 17:20:42.252067 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:42 crc kubenswrapper[4795]: I0320 17:20:42.251876 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:42 crc kubenswrapper[4795]: I0320 17:20:42.251858 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:42 crc kubenswrapper[4795]: E0320 17:20:42.252262 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:42 crc kubenswrapper[4795]: E0320 17:20:42.252358 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:42 crc kubenswrapper[4795]: E0320 17:20:42.252422 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:42 crc kubenswrapper[4795]: E0320 17:20:42.390743 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.223598 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/1.log" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.224288 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/0.log" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.224335 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8c31a7c-6ccb-43e0-9c95-33b85204cc39" containerID="c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86" exitCode=1 Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.224375 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerDied","Data":"c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86"} Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.224420 4795 scope.go:117] "RemoveContainer" containerID="e489315392bafe2d9a5368211b3c8a511522e01f2abd49a19e944e591306f38d" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.225131 4795 scope.go:117] "RemoveContainer" containerID="c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86" Mar 20 17:20:44 crc kubenswrapper[4795]: E0320 17:20:44.225433 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xxwb6_openshift-multus(c8c31a7c-6ccb-43e0-9c95-33b85204cc39)\"" pod="openshift-multus/multus-xxwb6" podUID="c8c31a7c-6ccb-43e0-9c95-33b85204cc39" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.252086 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:44 crc kubenswrapper[4795]: E0320 17:20:44.252311 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.252669 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:44 crc kubenswrapper[4795]: E0320 17:20:44.252836 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.253129 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:44 crc kubenswrapper[4795]: E0320 17:20:44.253257 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:44 crc kubenswrapper[4795]: I0320 17:20:44.253490 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:44 crc kubenswrapper[4795]: E0320 17:20:44.253623 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:45 crc kubenswrapper[4795]: I0320 17:20:45.230999 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/1.log" Mar 20 17:20:45 crc kubenswrapper[4795]: I0320 17:20:45.252852 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:20:45 crc kubenswrapper[4795]: E0320 17:20:45.253133 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-krk7q_openshift-ovn-kubernetes(520bb74b-cfa2-4f21-b561-989b0a3d6adc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" Mar 20 17:20:46 crc kubenswrapper[4795]: I0320 17:20:46.251083 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:46 crc kubenswrapper[4795]: I0320 17:20:46.251197 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:46 crc kubenswrapper[4795]: E0320 17:20:46.251252 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:46 crc kubenswrapper[4795]: I0320 17:20:46.251334 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:46 crc kubenswrapper[4795]: E0320 17:20:46.251628 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:46 crc kubenswrapper[4795]: E0320 17:20:46.251725 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:46 crc kubenswrapper[4795]: I0320 17:20:46.253117 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:46 crc kubenswrapper[4795]: E0320 17:20:46.253487 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:47 crc kubenswrapper[4795]: E0320 17:20:47.391398 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:48 crc kubenswrapper[4795]: I0320 17:20:48.251459 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:48 crc kubenswrapper[4795]: I0320 17:20:48.251526 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:48 crc kubenswrapper[4795]: I0320 17:20:48.251545 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:48 crc kubenswrapper[4795]: E0320 17:20:48.251607 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:48 crc kubenswrapper[4795]: I0320 17:20:48.251486 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:48 crc kubenswrapper[4795]: E0320 17:20:48.251901 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:48 crc kubenswrapper[4795]: E0320 17:20:48.252019 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:48 crc kubenswrapper[4795]: E0320 17:20:48.252061 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:50 crc kubenswrapper[4795]: I0320 17:20:50.252023 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:50 crc kubenswrapper[4795]: E0320 17:20:50.252204 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:50 crc kubenswrapper[4795]: I0320 17:20:50.252300 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:50 crc kubenswrapper[4795]: I0320 17:20:50.252323 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:50 crc kubenswrapper[4795]: E0320 17:20:50.252386 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:50 crc kubenswrapper[4795]: E0320 17:20:50.252533 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:50 crc kubenswrapper[4795]: I0320 17:20:50.252341 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:50 crc kubenswrapper[4795]: E0320 17:20:50.252651 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:52 crc kubenswrapper[4795]: I0320 17:20:52.251596 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:52 crc kubenswrapper[4795]: I0320 17:20:52.251644 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:52 crc kubenswrapper[4795]: I0320 17:20:52.251665 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:52 crc kubenswrapper[4795]: E0320 17:20:52.251831 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:52 crc kubenswrapper[4795]: I0320 17:20:52.251862 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:52 crc kubenswrapper[4795]: E0320 17:20:52.252034 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:52 crc kubenswrapper[4795]: E0320 17:20:52.252164 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:52 crc kubenswrapper[4795]: E0320 17:20:52.252435 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:52 crc kubenswrapper[4795]: E0320 17:20:52.392517 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:20:54 crc kubenswrapper[4795]: I0320 17:20:54.251953 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:54 crc kubenswrapper[4795]: I0320 17:20:54.252017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:54 crc kubenswrapper[4795]: E0320 17:20:54.252162 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:54 crc kubenswrapper[4795]: E0320 17:20:54.252297 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:54 crc kubenswrapper[4795]: I0320 17:20:54.252660 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:54 crc kubenswrapper[4795]: I0320 17:20:54.252976 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:54 crc kubenswrapper[4795]: E0320 17:20:54.252934 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:54 crc kubenswrapper[4795]: E0320 17:20:54.253094 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:56 crc kubenswrapper[4795]: I0320 17:20:56.252121 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:56 crc kubenswrapper[4795]: I0320 17:20:56.252182 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:56 crc kubenswrapper[4795]: I0320 17:20:56.252302 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:56 crc kubenswrapper[4795]: E0320 17:20:56.252543 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:56 crc kubenswrapper[4795]: I0320 17:20:56.252601 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:56 crc kubenswrapper[4795]: E0320 17:20:56.252805 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:56 crc kubenswrapper[4795]: E0320 17:20:56.252985 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:56 crc kubenswrapper[4795]: E0320 17:20:56.253119 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:57 crc kubenswrapper[4795]: I0320 17:20:57.253990 4795 scope.go:117] "RemoveContainer" containerID="c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86" Mar 20 17:20:57 crc kubenswrapper[4795]: E0320 17:20:57.393247 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.251440 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.251529 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.251446 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.251557 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:20:58 crc kubenswrapper[4795]: E0320 17:20:58.251966 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:20:58 crc kubenswrapper[4795]: E0320 17:20:58.251802 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:20:58 crc kubenswrapper[4795]: E0320 17:20:58.251659 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:20:58 crc kubenswrapper[4795]: E0320 17:20:58.252122 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.282161 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/1.log" Mar 20 17:20:58 crc kubenswrapper[4795]: I0320 17:20:58.282263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerStarted","Data":"199d60669fc8f63b3b210d2fc85e721bcf838edabcdff0694939a52f882125e7"} Mar 20 17:21:00 crc kubenswrapper[4795]: I0320 17:21:00.251314 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:21:00 crc kubenswrapper[4795]: I0320 17:21:00.251393 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:00 crc kubenswrapper[4795]: I0320 17:21:00.251523 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:21:00 crc kubenswrapper[4795]: E0320 17:21:00.251778 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:21:00 crc kubenswrapper[4795]: E0320 17:21:00.252373 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:21:00 crc kubenswrapper[4795]: E0320 17:21:00.252568 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:21:00 crc kubenswrapper[4795]: I0320 17:21:00.252876 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:21:00 crc kubenswrapper[4795]: I0320 17:21:00.253045 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:00 crc kubenswrapper[4795]: E0320 17:21:00.253296 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.189851 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jpp4c"] Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.190002 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:01 crc kubenswrapper[4795]: E0320 17:21:01.190153 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.299222 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/3.log" Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.302767 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerStarted","Data":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.303346 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:21:01 crc kubenswrapper[4795]: I0320 17:21:01.343934 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podStartSLOduration=148.343903467 podStartE2EDuration="2m28.343903467s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:01.342869101 +0000 UTC m=+204.800900732" watchObservedRunningTime="2026-03-20 17:21:01.343903467 +0000 UTC m=+204.801935048" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.216071 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216369 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:23:04.21633232 +0000 UTC m=+327.674363901 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.216521 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.216568 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.216611 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216628 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216763 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:23:04.216747174 +0000 UTC m=+327.674778745 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216901 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216928 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216920 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.216946 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.217069 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:23:04.217033614 +0000 UTC m=+327.675065265 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.217147 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:23:04.217128098 +0000 UTC m=+327.675159669 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.251285 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.251399 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.251302 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.251509 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.251651 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.251810 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:21:02 crc kubenswrapper[4795]: I0320 17:21:02.317579 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.317888 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.317923 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.317943 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.318053 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:23:04.31802719 +0000 UTC m=+327.776058761 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:21:02 crc kubenswrapper[4795]: E0320 17:21:02.395197 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 17:21:03 crc kubenswrapper[4795]: I0320 17:21:03.251358 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:03 crc kubenswrapper[4795]: E0320 17:21:03.251571 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:21:04 crc kubenswrapper[4795]: I0320 17:21:04.252042 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:04 crc kubenswrapper[4795]: I0320 17:21:04.252048 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:21:04 crc kubenswrapper[4795]: I0320 17:21:04.252238 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:21:04 crc kubenswrapper[4795]: E0320 17:21:04.252245 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:21:04 crc kubenswrapper[4795]: E0320 17:21:04.253076 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:21:04 crc kubenswrapper[4795]: E0320 17:21:04.253403 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:21:05 crc kubenswrapper[4795]: I0320 17:21:05.252144 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:05 crc kubenswrapper[4795]: E0320 17:21:05.252313 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:21:06 crc kubenswrapper[4795]: I0320 17:21:06.252274 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:21:06 crc kubenswrapper[4795]: I0320 17:21:06.252396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:21:06 crc kubenswrapper[4795]: I0320 17:21:06.252281 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:06 crc kubenswrapper[4795]: E0320 17:21:06.252473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:21:06 crc kubenswrapper[4795]: E0320 17:21:06.252579 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:21:06 crc kubenswrapper[4795]: E0320 17:21:06.252732 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:21:07 crc kubenswrapper[4795]: I0320 17:21:07.251620 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:07 crc kubenswrapper[4795]: E0320 17:21:07.253609 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpp4c" podUID="996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77" Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.251265 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.251366 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.251367 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.253743 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.253775 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.254389 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 17:21:08 crc kubenswrapper[4795]: I0320 17:21:08.254490 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 17:21:09 crc kubenswrapper[4795]: I0320 17:21:09.252109 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:09 crc kubenswrapper[4795]: I0320 17:21:09.262023 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 17:21:09 crc kubenswrapper[4795]: I0320 17:21:09.262459 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.301098 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.301460 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.526510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.597632 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bl2bp"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.598817 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.610001 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.611113 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.612040 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.619468 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.621538 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.621900 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.622924 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.623160 4795 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.623423 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.636427 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.637508 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.638350 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.639077 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.639339 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hn4r8"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.639780 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.641394 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.641721 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.642354 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.642634 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.643076 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.643084 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.644326 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.648495 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-xzx7n"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.649201 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.650052 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.652170 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.652843 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.653965 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.654265 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.654374 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.654953 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.655526 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.655736 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656019 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656203 
4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656078 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656731 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656868 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.656996 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.657117 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.658010 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.658453 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.661810 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.662774 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.663618 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-97wlq"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.664401 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-97wlq"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.665084 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.665566 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.666189 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-gplds"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.666823 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.669230 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mmtf7"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.669670 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.670862 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.671387 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.673310 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.674024 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.677814 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.678258 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-45pjp"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.678442 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.678756 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.685544 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.686010 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.686265 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.686675 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.686990 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.687159 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.688445 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.688733 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.688959 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.689237 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.692910 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.693272 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.693825 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.694130 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.694238 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.703353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.693920 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.704133 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.704869 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.705015 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.706346 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.706789 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.707420 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.711023 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.708799 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.712435 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.710642 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cdrcc"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.712631 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.712929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.710864 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.714570 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.709240 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.709307 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.709436 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.709614 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.709659 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.710386 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.710449 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.712300 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.714991 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.715311 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.715672 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.715912 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.717114 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.717363 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.730854 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.731068 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.731248 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.731419 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.736576 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.736948 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.737606 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.737801 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.737887 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.738052 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.738163 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.738892 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.739976 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740247 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-images\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-audit\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740326 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740384 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740397 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740417 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5lkn\" (UniqueName: \"kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-node-pullsecrets\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740531 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740799 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.741432 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742097 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.740534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-encryption-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742475 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-serving-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742510 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-image-import-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8l7c\" (UniqueName: \"kubernetes.io/projected/9f31b9ac-9447-4b20-ac60-7532edfa4600-kube-api-access-q8l7c\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742578 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-client\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742621 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-audit-dir\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-serving-cert\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f31b9ac-9447-4b20-ac60-7532edfa4600-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742738 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9ptd\" (UniqueName: \"kubernetes.io/projected/5c21571e-5513-46e0-9eed-4ec64df8e445-kube-api-access-w9ptd\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742783 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dskm\" (UniqueName: \"kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742853 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742874 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-config\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.742896 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.743025 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.743145 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.743361 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.744028 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.744945 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.746698 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.748890 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.749535 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.749631 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.751831 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bl2bp"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.752239 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.753051 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.753814 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.754007 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.754081 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.754234 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.758193 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.760338 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.760642 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-lrxrs"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.761137 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.761252 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.761552 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.761763 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lrxrs"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.763942 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.764452 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.781769 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.782762 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.784380 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.785572 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xzx7n"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.789333 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.789723 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gpx9r"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.790459 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.794341 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.794886 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.795534 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.796447 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.796564 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.797019 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.797768 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.798616 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.798847 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.800070 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567120-j7789"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.800403 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-j7789"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.802041 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.802193 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c49vv"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.802520 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.802855 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.803578 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.804863 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.805632 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.806008 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.806741 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf"
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.806787 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh"]
Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.807592 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.809291 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kgsw2"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.809778 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.810089 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.810422 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.810597 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.810746 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.811430 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.811916 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.812628 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.812989 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.813889 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8v58t"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.814572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.815843 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-45pjp"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.816866 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.818086 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.819391 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.820811 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.821105 4795 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-console/console-f9d7485db-hn4r8"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.822410 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.823725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.825492 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cdrcc"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.826633 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.828267 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.829560 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.830560 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.831532 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.832711 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c49vv"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.834057 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k"] Mar 20 
17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.835123 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.836473 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.837763 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-97wlq"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.838733 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.840906 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mmtf7"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.841204 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.843305 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844386 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n92f4\" (UniqueName: 
\"kubernetes.io/projected/0415738e-f327-433a-9a28-0a991138e021-kube-api-access-n92f4\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-image-import-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844463 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8l7c\" (UniqueName: \"kubernetes.io/projected/9f31b9ac-9447-4b20-ac60-7532edfa4600-kube-api-access-q8l7c\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89cb82f-a141-419f-bf33-93c219c84e51-metrics-tls\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844549 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-client\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844589 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5scj\" (UniqueName: \"kubernetes.io/projected/2dbe21c7-d209-4259-b51d-b486b741e9c7-kube-api-access-c5scj\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-audit-dir\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844631 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844651 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6f292b8-878f-418e-8c85-2f7818e9dba1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-machine-approver-tls\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844725 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-serving-cert\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.844837 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-audit-dir\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.845129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-serving-cert\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.845171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/9f31b9ac-9447-4b20-ac60-7532edfa4600-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.845717 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-image-import-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846184 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9ptd\" (UniqueName: \"kubernetes.io/projected/5c21571e-5513-46e0-9eed-4ec64df8e445-kube-api-access-w9ptd\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-encryption-config\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846395 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846422 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f292b8-878f-418e-8c85-2f7818e9dba1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846442 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c89cb82f-a141-419f-bf33-93c219c84e51-trusted-ca\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846472 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dskm\" (UniqueName: \"kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0415738e-f327-433a-9a28-0a991138e021-audit-dir\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-config\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846607 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2dbe21c7-d209-4259-b51d-b486b741e9c7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846632 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846651 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846677 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-images\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.846747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n6pt\" (UniqueName: \"kubernetes.io/projected/e6f292b8-878f-418e-8c85-2f7818e9dba1-kube-api-access-7n6pt\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.847109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc 
kubenswrapper[4795]: I0320 17:21:11.847476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.847594 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-audit\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.847722 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.847863 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.851113 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.848212 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-images\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: 
\"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.848819 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-audit\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.850750 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-client\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.849231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.850797 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-j7789"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.852169 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.852520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.853056 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.853134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f31b9ac-9447-4b20-ac60-7532edfa4600-config\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.854518 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5gvs\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-kube-api-access-g5gvs\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.854646 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gpx9r"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.854894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.855884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857443 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-auth-proxy-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5lkn\" (UniqueName: \"kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc 
kubenswrapper[4795]: I0320 17:21:11.857631 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb25j\" (UniqueName: \"kubernetes.io/projected/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-kube-api-access-lb25j\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857761 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-etcd-client\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-encryption-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " 
pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-node-pullsecrets\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857866 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857894 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-audit-policies\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-serving-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.857949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dbe21c7-d209-4259-b51d-b486b741e9c7-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: 
\"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.858411 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.858495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5c21571e-5513-46e0-9eed-4ec64df8e445-node-pullsecrets\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.858760 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.859283 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.859390 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.859830 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5c21571e-5513-46e0-9eed-4ec64df8e445-etcd-serving-ca\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.860817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.861202 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f31b9ac-9447-4b20-ac60-7532edfa4600-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: \"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.861223 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.861766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-encryption-config\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.863134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c21571e-5513-46e0-9eed-4ec64df8e445-serving-cert\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " 
pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.863438 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.864020 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.865303 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.866448 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kgsw2"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.867831 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.868982 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-454wp"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.870837 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.870916 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.872928 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5v822"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.873893 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5v822" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.874264 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.875485 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.876903 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-454wp"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.878575 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5v822"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.880049 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gdx4t"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.881600 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.882327 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gdx4t"] Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.882403 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.900921 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.921126 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.941350 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.958669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-encryption-config\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.958846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f292b8-878f-418e-8c85-2f7818e9dba1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.958946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0415738e-f327-433a-9a28-0a991138e021-audit-dir\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c89cb82f-a141-419f-bf33-93c219c84e51-trusted-ca\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2dbe21c7-d209-4259-b51d-b486b741e9c7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959614 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2dbe21c7-d209-4259-b51d-b486b741e9c7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959612 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc 
kubenswrapper[4795]: I0320 17:21:11.959741 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n6pt\" (UniqueName: \"kubernetes.io/projected/e6f292b8-878f-418e-8c85-2f7818e9dba1-kube-api-access-7n6pt\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5gvs\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-kube-api-access-g5gvs\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960031 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-auth-proxy-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb25j\" (UniqueName: 
\"kubernetes.io/projected/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-kube-api-access-lb25j\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960102 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-etcd-client\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-audit-policies\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960164 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dbe21c7-d209-4259-b51d-b486b741e9c7-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960181 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960195 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n92f4\" (UniqueName: \"kubernetes.io/projected/0415738e-f327-433a-9a28-0a991138e021-kube-api-access-n92f4\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.959107 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0415738e-f327-433a-9a28-0a991138e021-audit-dir\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89cb82f-a141-419f-bf33-93c219c84e51-metrics-tls\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960287 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5scj\" (UniqueName: \"kubernetes.io/projected/2dbe21c7-d209-4259-b51d-b486b741e9c7-kube-api-access-c5scj\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960313 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6f292b8-878f-418e-8c85-2f7818e9dba1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960337 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-serving-cert\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-machine-approver-tls\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960938 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-auth-proxy-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.960955 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.961044 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-encryption-config\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.961440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.962162 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-config\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.962197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0415738e-f327-433a-9a28-0a991138e021-audit-policies\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.962258 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6f292b8-878f-418e-8c85-2f7818e9dba1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.963206 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c89cb82f-a141-419f-bf33-93c219c84e51-trusted-ca\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.963475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-machine-approver-tls\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.964070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-etcd-client\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.964495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89cb82f-a141-419f-bf33-93c219c84e51-metrics-tls\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.964985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0415738e-f327-433a-9a28-0a991138e021-serving-cert\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.965261 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dbe21c7-d209-4259-b51d-b486b741e9c7-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.973427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f292b8-878f-418e-8c85-2f7818e9dba1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:11 crc kubenswrapper[4795]: I0320 17:21:11.982145 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.003929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.021929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.041380 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.061557 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.081604 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.101210 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 
20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.120841 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.162263 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.181349 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.202779 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.222951 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.253926 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.261472 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.282012 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.301997 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.322586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.342817 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.362466 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.382001 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.402491 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.421041 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.442641 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.462300 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.482167 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.503876 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.541834 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.563193 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.582232 4795 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.601032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.621969 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.641616 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.662038 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.682771 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.702448 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.721816 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.741533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.761811 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.782644 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.802977 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.820459 4795 request.go:700] Waited for 1.016499818s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmcc-proxy-tls&limit=500&resourceVersion=0 Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.822815 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.842129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.862163 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.883075 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.902807 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.922495 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.942511 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:21:12 crc kubenswrapper[4795]: I0320 17:21:12.961880 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.003586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.003950 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.021290 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.042042 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.062600 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.081880 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.103889 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.121834 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.142094 4795 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.162476 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.181945 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.203208 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.221927 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.241991 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.261929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.282197 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.302089 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.322233 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.342384 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.362520 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.382340 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.402468 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.421950 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.441909 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.461835 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.481764 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.502099 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.551091 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8l7c\" (UniqueName: \"kubernetes.io/projected/9f31b9ac-9447-4b20-ac60-7532edfa4600-kube-api-access-q8l7c\") pod \"machine-api-operator-5694c8668f-p5hmr\" (UID: 
\"9f31b9ac-9447-4b20-ac60-7532edfa4600\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.571652 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9ptd\" (UniqueName: \"kubernetes.io/projected/5c21571e-5513-46e0-9eed-4ec64df8e445-kube-api-access-w9ptd\") pod \"apiserver-76f77b778f-bl2bp\" (UID: \"5c21571e-5513-46e0-9eed-4ec64df8e445\") " pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.589331 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dskm\" (UniqueName: \"kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm\") pod \"console-f9d7485db-hn4r8\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.604100 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.609104 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5lkn\" (UniqueName: \"kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn\") pod \"controller-manager-879f6c89f-nww6d\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.622649 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.642303 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.662651 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.682303 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.701687 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.722534 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.741636 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.741783 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.761973 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.765017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.781923 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.786662 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.803749 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.835197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.840207 4795 request.go:700] Waited for 1.879866724s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/serviceaccounts/openshift-controller-manager-operator/token Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.849764 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5gvs\" (UniqueName: \"kubernetes.io/projected/c89cb82f-a141-419f-bf33-93c219c84e51-kube-api-access-g5gvs\") pod \"ingress-operator-5b745b69d9-z5ls6\" (UID: \"c89cb82f-a141-419f-bf33-93c219c84e51\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.878402 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n6pt\" (UniqueName: \"kubernetes.io/projected/e6f292b8-878f-418e-8c85-2f7818e9dba1-kube-api-access-7n6pt\") pod \"openshift-controller-manager-operator-756b6f6bc6-4rrsb\" (UID: \"e6f292b8-878f-418e-8c85-2f7818e9dba1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.902339 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.904690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb25j\" (UniqueName: \"kubernetes.io/projected/bfa1fcc6-c9f8-4928-8a95-6c418323dd69-kube-api-access-lb25j\") pod \"machine-approver-56656f9798-gplds\" (UID: \"bfa1fcc6-c9f8-4928-8a95-6c418323dd69\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.908829 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n92f4\" (UniqueName: \"kubernetes.io/projected/0415738e-f327-433a-9a28-0a991138e021-kube-api-access-n92f4\") pod \"apiserver-7bbb656c7d-p42nc\" (UID: \"0415738e-f327-433a-9a28-0a991138e021\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.926592 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.933296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5scj\" (UniqueName: \"kubernetes.io/projected/2dbe21c7-d209-4259-b51d-b486b741e9c7-kube-api-access-c5scj\") pod \"openshift-config-operator-7777fb866f-p9xgc\" (UID: \"2dbe21c7-d209-4259-b51d-b486b741e9c7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.951187 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983554 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983637 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpnfm\" (UniqueName: \"kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983658 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983685 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983738 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983870 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdtvs\" (UniqueName: \"kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.983981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0589e639-75bf-4a26-a80b-dbb69a6c9955-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984075 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2spg\" (UniqueName: 
\"kubernetes.io/projected/e5e80a44-9bdc-4321-9536-8eba4527f181-kube-api-access-s2spg\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984220 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmx2\" (UniqueName: \"kubernetes.io/projected/5c603995-8326-4bea-892a-74ee1e8c8dea-kube-api-access-bzmx2\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr9b7\" (UniqueName: \"kubernetes.io/projected/109e018a-9ad5-40e6-bd49-07d49d718161-kube-api-access-cr9b7\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984322 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-serving-cert\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-427xq\" (UniqueName: \"kubernetes.io/projected/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-kube-api-access-427xq\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984364 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-serving-cert\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984425 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589e639-75bf-4a26-a80b-dbb69a6c9955-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984444 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/32dc8fa2-0199-444e-9983-4af0fb9172b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: \"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984620 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: 
\"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984905 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shn6h\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-kube-api-access-shn6h\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984935 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-config\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.984988 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: E0320 17:21:13.985261 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:14.485247731 +0000 UTC m=+217.943279282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985356 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-trusted-ca\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985513 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mhf6\" (UniqueName: \"kubernetes.io/projected/7074cf98-12f4-4a73-ad96-4959f64398a7-kube-api-access-5mhf6\") pod \"downloads-7954f5f757-xzx7n\" (UID: \"7074cf98-12f4-4a73-ad96-4959f64398a7\") " pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-service-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985641 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985663 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2669j\" (UniqueName: \"kubernetes.io/projected/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-kube-api-access-2669j\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985712 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2bwk\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985797 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985851 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c2qp\" (UniqueName: \"kubernetes.io/projected/32dc8fa2-0199-444e-9983-4af0fb9172b1-kube-api-access-8c2qp\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: \"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985915 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.985942 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-config\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-client\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: 
\"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/109e018a-9ad5-40e6-bd49-07d49d718161-metrics-tls\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986431 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-config\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986612 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.986634 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c603995-8326-4bea-892a-74ee1e8c8dea-serving-cert\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:13 crc kubenswrapper[4795]: I0320 17:21:13.998207 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.004103 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.064848 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bl2bp"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087372 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087501 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe2960a0-9218-4d46-8c50-7285c5e27882-proxy-tls\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087522 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087537 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/587836f8-b700-43d0-940e-81d7820b2a6b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-config-volume\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-srv-cert\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087591 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-cabundle\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087605 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-images\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmx2\" (UniqueName: 
\"kubernetes.io/projected/5c603995-8326-4bea-892a-74ee1e8c8dea-kube-api-access-bzmx2\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087640 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fwn5\" (UniqueName: \"kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5\") pod \"auto-csr-approver-29567120-j7789\" (UID: \"bed1d31b-b060-45c3-95bf-3b226a36efe1\") " pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbs2f\" (UniqueName: \"kubernetes.io/projected/5611db8a-18df-426e-a6e7-7f6720da4109-kube-api-access-nbs2f\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087674 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-stats-auth\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087705 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587836f8-b700-43d0-940e-81d7820b2a6b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-metrics-certs\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087773 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jmdf\" (UniqueName: \"kubernetes.io/projected/661df377-ed57-4f75-9be9-3fc5f87cf37e-kube-api-access-4jmdf\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvqj4\" (UniqueName: \"kubernetes.io/projected/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-kube-api-access-wvqj4\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087812 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-serving-cert\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087845 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shn6h\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-kube-api-access-shn6h\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0068e5c-7377-479d-9cc5-fd1270c74b33-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087899 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-webhook-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087932 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzk8h\" (UniqueName: \"kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087956 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.087997 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xln6k\" (UniqueName: \"kubernetes.io/projected/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-kube-api-access-xln6k\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088016 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2bwk\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-profile-collector-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088073 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c2qp\" (UniqueName: \"kubernetes.io/projected/32dc8fa2-0199-444e-9983-4af0fb9172b1-kube-api-access-8c2qp\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: 
\"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088089 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cdbk\" (UniqueName: \"kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-client\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088152 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088168 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088184 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088202 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbc5q\" (UniqueName: \"kubernetes.io/projected/ff31d5af-4eae-43e7-8512-c6f5d54501e1-kube-api-access-lbc5q\") pod \"migrator-59844c95c7-q9zkh\" (UID: \"ff31d5af-4eae-43e7-8512-c6f5d54501e1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a61cce87-0b4f-4886-a347-b98aecad272a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qj44k\" (UID: 
\"a61cce87-0b4f-4886-a347-b98aecad272a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088263 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088277 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-config\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088301 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088316 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2da6bf3-e17d-4adf-96dd-ea097cae192b-proxy-tls\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088356 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c603995-8326-4bea-892a-74ee1e8c8dea-serving-cert\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/661df377-ed57-4f75-9be9-3fc5f87cf37e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-registration-dir\") pod \"csi-hostpathplugin-454wp\" (UID: 
\"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc57139-c6ad-4639-a09f-d07f8da49f4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088443 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-node-bootstrap-token\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088479 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfz4k\" (UniqueName: \"kubernetes.io/projected/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-kube-api-access-jfz4k\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088511 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-metrics-tls\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088545 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdtvs\" (UniqueName: \"kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088577 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0589e639-75bf-4a26-a80b-dbb69a6c9955-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088595 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtbkf\" (UniqueName: \"kubernetes.io/projected/7479d10c-1c3b-497e-8dda-07cd22aeccf0-kube-api-access-qtbkf\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.088614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2spg\" (UniqueName: \"kubernetes.io/projected/e5e80a44-9bdc-4321-9536-8eba4527f181-kube-api-access-s2spg\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5611db8a-18df-426e-a6e7-7f6720da4109-tmpfs\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlnwq\" (UniqueName: 
\"kubernetes.io/projected/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-kube-api-access-qlnwq\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: \"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089847 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089874 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.089900 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-default-certificate\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090284 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 
17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090276 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090346 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-mountpoint-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090413 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr9b7\" (UniqueName: \"kubernetes.io/projected/109e018a-9ad5-40e6-bd49-07d49d718161-kube-api-access-cr9b7\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090435 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428476fd-a8f2-4ffc-bda6-f19da80778ac-cert\") pod \"ingress-canary-gdx4t\" (UID: 
\"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090524 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-serving-cert\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-427xq\" (UniqueName: \"kubernetes.io/projected/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-kube-api-access-427xq\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090570 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-serving-cert\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589e639-75bf-4a26-a80b-dbb69a6c9955-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/32dc8fa2-0199-444e-9983-4af0fb9172b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: \"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090688 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-certs\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090728 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/661df377-ed57-4f75-9be9-3fc5f87cf37e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090760 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090781 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-config\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090894 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090943 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2jpd\" (UniqueName: \"kubernetes.io/projected/b4698fe3-a607-4978-bad4-5b83d3beb21b-kube-api-access-k2jpd\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.090971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bphg\" (UniqueName: \"kubernetes.io/projected/489757b2-0de4-4275-8931-daa5c3b4a75a-kube-api-access-2bphg\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 
17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091014 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-trusted-ca\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mhf6\" (UniqueName: \"kubernetes.io/projected/7074cf98-12f4-4a73-ad96-4959f64398a7-kube-api-access-5mhf6\") pod \"downloads-7954f5f757-xzx7n\" (UID: \"7074cf98-12f4-4a73-ad96-4959f64398a7\") " pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091058 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-service-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd81e-16ce-4b5c-8667-85c9426a9221-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gpx9r\" (UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsz6b\" (UniqueName: \"kubernetes.io/projected/ae2bd81e-16ce-4b5c-8667-85c9426a9221-kube-api-access-gsz6b\") pod \"multus-admission-controller-857f4d67dd-gpx9r\" 
(UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091142 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091165 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2669j\" (UniqueName: \"kubernetes.io/projected/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-kube-api-access-2669j\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091190 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091213 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-apiservice-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091237 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7dc57139-c6ad-4639-a09f-d07f8da49f4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091263 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7c4k\" (UniqueName: \"kubernetes.io/projected/067aa008-8dda-4bfe-bfd2-388abdb54299-kube-api-access-h7c4k\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091330 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fj9p\" (UniqueName: \"kubernetes.io/projected/a61cce87-0b4f-4886-a347-b98aecad272a-kube-api-access-8fj9p\") pod \"package-server-manager-789f6589d5-qj44k\" (UID: \"a61cce87-0b4f-4886-a347-b98aecad272a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-config\") pod 
\"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091426 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0068e5c-7377-479d-9cc5-fd1270c74b33-config\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.091478 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:14.591461754 +0000 UTC m=+218.049493295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091646 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-srv-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091677 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091745 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0068e5c-7377-479d-9cc5-fd1270c74b33-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091757 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091775 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/109e018a-9ad5-40e6-bd49-07d49d718161-metrics-tls\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091799 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-key\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9cd4\" (UniqueName: \"kubernetes.io/projected/fe2960a0-9218-4d46-8c50-7285c5e27882-kube-api-access-m9cd4\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091862 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-config\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091905 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2pxd\" (UniqueName: \"kubernetes.io/projected/428476fd-a8f2-4ffc-bda6-f19da80778ac-kube-api-access-c2pxd\") pod \"ingress-canary-gdx4t\" (UID: \"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-plugins-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.091990 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir\") pod 
\"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.092014 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587836f8-b700-43d0-940e-81d7820b2a6b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.092043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.092068 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpnfm\" (UniqueName: \"kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.092095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093479 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093591 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg7w2\" (UniqueName: \"kubernetes.io/projected/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-kube-api-access-mg7w2\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093641 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/067aa008-8dda-4bfe-bfd2-388abdb54299-service-ca-bundle\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093718 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc57139-c6ad-4639-a09f-d07f8da49f4e-config\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2da6bf3-e17d-4adf-96dd-ea097cae192b-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmgxw\" (UniqueName: \"kubernetes.io/projected/d2da6bf3-e17d-4adf-96dd-ea097cae192b-kube-api-access-tmgxw\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093795 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-csi-data-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.093856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: \"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.094498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.094722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-config\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.094752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.094935 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.095505 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.095657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: 
\"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.095979 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.096534 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.096575 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.096664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.097091 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-config\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.097657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c603995-8326-4bea-892a-74ee1e8c8dea-trusted-ca\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.098109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.098621 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-service-ca\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.099672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.099709 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-etcd-client\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.100025 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0589e639-75bf-4a26-a80b-dbb69a6c9955-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.100296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.100486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-socket-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.100881 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-config\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.101218 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.101668 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.103187 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/109e018a-9ad5-40e6-bd49-07d49d718161-metrics-tls\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.104557 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-serving-cert\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: \"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.105821 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589e639-75bf-4a26-a80b-dbb69a6c9955-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.106342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/32dc8fa2-0199-444e-9983-4af0fb9172b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: \"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.106608 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c603995-8326-4bea-892a-74ee1e8c8dea-serving-cert\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.106984 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.111095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.111207 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.111518 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.111626 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e80a44-9bdc-4321-9536-8eba4527f181-serving-cert\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.111853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.112322 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.112591 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.117965 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2spg\" (UniqueName: \"kubernetes.io/projected/e5e80a44-9bdc-4321-9536-8eba4527f181-kube-api-access-s2spg\") pod \"etcd-operator-b45778765-cdrcc\" (UID: \"e5e80a44-9bdc-4321-9536-8eba4527f181\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.132302 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-p5hmr"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.136059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shn6h\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-kube-api-access-shn6h\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.145790 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hn4r8"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.159803 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2bwk\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk\") pod 
\"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.173330 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f31b9ac_9447_4b20_ac60_7532edfa4600.slice/crio-bb67d7239ac81c201272fb1b9adeb1ecbf97d522b376528a2f9bfe7e33a8004b WatchSource:0}: Error finding container bb67d7239ac81c201272fb1b9adeb1ecbf97d522b376528a2f9bfe7e33a8004b: Status 404 returned error can't find the container with id bb67d7239ac81c201272fb1b9adeb1ecbf97d522b376528a2f9bfe7e33a8004b Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.180134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c2qp\" (UniqueName: \"kubernetes.io/projected/32dc8fa2-0199-444e-9983-4af0fb9172b1-kube-api-access-8c2qp\") pod \"cluster-samples-operator-665b6dd947-g5zjc\" (UID: \"32dc8fa2-0199-444e-9983-4af0fb9172b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.180948 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.193819 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdtvs\" (UniqueName: \"kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs\") pod \"oauth-openshift-558db77b4-mmtf7\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.199080 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc"] Mar 20 17:21:14 crc 
kubenswrapper[4795]: I0320 17:21:14.201255 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-profile-collector-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cdbk\" (UniqueName: \"kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201299 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbc5q\" (UniqueName: \"kubernetes.io/projected/ff31d5af-4eae-43e7-8512-c6f5d54501e1-kube-api-access-lbc5q\") pod \"migrator-59844c95c7-q9zkh\" (UID: \"ff31d5af-4eae-43e7-8512-c6f5d54501e1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a61cce87-0b4f-4886-a347-b98aecad272a-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-qj44k\" (UID: \"a61cce87-0b4f-4886-a347-b98aecad272a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-config\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2da6bf3-e17d-4adf-96dd-ea097cae192b-proxy-tls\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/661df377-ed57-4f75-9be9-3fc5f87cf37e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-registration-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201423 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc57139-c6ad-4639-a09f-d07f8da49f4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-node-bootstrap-token\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201454 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfz4k\" (UniqueName: \"kubernetes.io/projected/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-kube-api-access-jfz4k\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-metrics-tls\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc 
kubenswrapper[4795]: I0320 17:21:14.201496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtbkf\" (UniqueName: \"kubernetes.io/projected/7479d10c-1c3b-497e-8dda-07cd22aeccf0-kube-api-access-qtbkf\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5611db8a-18df-426e-a6e7-7f6720da4109-tmpfs\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlnwq\" (UniqueName: \"kubernetes.io/projected/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-kube-api-access-qlnwq\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: \"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201555 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-default-certificate\") pod \"router-default-5444994796-lrxrs\" (UID: 
\"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201577 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-mountpoint-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428476fd-a8f2-4ffc-bda6-f19da80778ac-cert\") pod \"ingress-canary-gdx4t\" (UID: \"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-certs\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/661df377-ed57-4f75-9be9-3fc5f87cf37e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201652 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2jpd\" (UniqueName: \"kubernetes.io/projected/b4698fe3-a607-4978-bad4-5b83d3beb21b-kube-api-access-k2jpd\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bphg\" (UniqueName: \"kubernetes.io/projected/489757b2-0de4-4275-8931-daa5c3b4a75a-kube-api-access-2bphg\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201973 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd81e-16ce-4b5c-8667-85c9426a9221-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gpx9r\" (UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.201997 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsz6b\" (UniqueName: \"kubernetes.io/projected/ae2bd81e-16ce-4b5c-8667-85c9426a9221-kube-api-access-gsz6b\") pod \"multus-admission-controller-857f4d67dd-gpx9r\" (UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc 
kubenswrapper[4795]: I0320 17:21:14.202021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202042 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-apiservice-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202061 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7dc57139-c6ad-4639-a09f-d07f8da49f4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202075 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7c4k\" (UniqueName: \"kubernetes.io/projected/067aa008-8dda-4bfe-bfd2-388abdb54299-kube-api-access-h7c4k\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202089 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fj9p\" (UniqueName: \"kubernetes.io/projected/a61cce87-0b4f-4886-a347-b98aecad272a-kube-api-access-8fj9p\") pod 
\"package-server-manager-789f6589d5-qj44k\" (UID: \"a61cce87-0b4f-4886-a347-b98aecad272a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202106 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0068e5c-7377-479d-9cc5-fd1270c74b33-config\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-srv-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0068e5c-7377-479d-9cc5-fd1270c74b33-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-key\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202174 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m9cd4\" (UniqueName: \"kubernetes.io/projected/fe2960a0-9218-4d46-8c50-7285c5e27882-kube-api-access-m9cd4\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2pxd\" (UniqueName: \"kubernetes.io/projected/428476fd-a8f2-4ffc-bda6-f19da80778ac-kube-api-access-c2pxd\") pod \"ingress-canary-gdx4t\" (UID: \"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202207 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-plugins-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587836f8-b700-43d0-940e-81d7820b2a6b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg7w2\" (UniqueName: \"kubernetes.io/projected/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-kube-api-access-mg7w2\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202272 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/067aa008-8dda-4bfe-bfd2-388abdb54299-service-ca-bundle\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202287 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc57139-c6ad-4639-a09f-d07f8da49f4e-config\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2da6bf3-e17d-4adf-96dd-ea097cae192b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202318 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmgxw\" (UniqueName: \"kubernetes.io/projected/d2da6bf3-e17d-4adf-96dd-ea097cae192b-kube-api-access-tmgxw\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-csi-data-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202350 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: \"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202368 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-socket-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202398 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe2960a0-9218-4d46-8c50-7285c5e27882-proxy-tls\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202414 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: 
I0320 17:21:14.202429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/587836f8-b700-43d0-940e-81d7820b2a6b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-config-volume\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-srv-cert\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202475 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-cabundle\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-images\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fwn5\" (UniqueName: \"kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5\") pod \"auto-csr-approver-29567120-j7789\" (UID: \"bed1d31b-b060-45c3-95bf-3b226a36efe1\") " pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202528 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbs2f\" (UniqueName: \"kubernetes.io/projected/5611db8a-18df-426e-a6e7-7f6720da4109-kube-api-access-nbs2f\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-stats-auth\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202559 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587836f8-b700-43d0-940e-81d7820b2a6b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202587 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-metrics-certs\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202618 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jmdf\" (UniqueName: \"kubernetes.io/projected/661df377-ed57-4f75-9be9-3fc5f87cf37e-kube-api-access-4jmdf\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvqj4\" (UniqueName: \"kubernetes.io/projected/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-kube-api-access-wvqj4\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202648 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-serving-cert\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 
17:21:14.202664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0068e5c-7377-479d-9cc5-fd1270c74b33-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202684 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-webhook-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202710 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzk8h\" (UniqueName: \"kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.202729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xln6k\" (UniqueName: \"kubernetes.io/projected/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-kube-api-access-xln6k\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.203622 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-registration-dir\") pod \"csi-hostpathplugin-454wp\" (UID: 
\"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.203844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/067aa008-8dda-4bfe-bfd2-388abdb54299-service-ca-bundle\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.204332 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-mountpoint-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.205394 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-profile-collector-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.205904 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.206045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.206578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc57139-c6ad-4639-a09f-d07f8da49f4e-config\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.207255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2da6bf3-e17d-4adf-96dd-ea097cae192b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.207399 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-csi-data-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.207948 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7dc57139-c6ad-4639-a09f-d07f8da49f4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.208379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-default-certificate\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.208814 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-socket-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.209408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0068e5c-7377-479d-9cc5-fd1270c74b33-config\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.209648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/489757b2-0de4-4275-8931-daa5c3b4a75a-plugins-dir\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.210141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/661df377-ed57-4f75-9be9-3fc5f87cf37e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.210405 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:14.710391344 +0000 UTC m=+218.168422885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.210586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: \"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.211805 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2da6bf3-e17d-4adf-96dd-ea097cae192b-proxy-tls\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.211855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a61cce87-0b4f-4886-a347-b98aecad272a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-qj44k\" (UID: \"a61cce87-0b4f-4886-a347-b98aecad272a\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.212326 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-config\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.212470 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587836f8-b700-43d0-940e-81d7820b2a6b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.212737 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b4698fe3-a607-4978-bad4-5b83d3beb21b-srv-cert\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.213121 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-metrics-tls\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.213304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-stats-auth\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") 
" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214026 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-certs\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-node-bootstrap-token\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214211 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214349 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/661df377-ed57-4f75-9be9-3fc5f87cf37e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd81e-16ce-4b5c-8667-85c9426a9221-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-gpx9r\" (UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.214596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5611db8a-18df-426e-a6e7-7f6720da4109-tmpfs\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.215126 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-images\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.215651 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-cabundle\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.216126 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-config-volume\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.216211 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fe2960a0-9218-4d46-8c50-7285c5e27882-proxy-tls\") pod 
\"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.216264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-webhook-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.216667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0068e5c-7377-479d-9cc5-fd1270c74b33-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.217279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/067aa008-8dda-4bfe-bfd2-388abdb54299-metrics-certs\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.217628 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/428476fd-a8f2-4ffc-bda6-f19da80778ac-cert\") pod \"ingress-canary-gdx4t\" (UID: \"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.217816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-serving-cert\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.221109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7479d10c-1c3b-497e-8dda-07cd22aeccf0-signing-key\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.221752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fe2960a0-9218-4d46-8c50-7285c5e27882-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.222142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5611db8a-18df-426e-a6e7-7f6720da4109-apiservice-cert\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.224212 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dbe21c7_d209_4259_b51d_b486b741e9c7.slice/crio-66e52cd7208c95fcbd71fb1927add0955601cd0ef07bbfdc1ed028ec10aff186 WatchSource:0}: Error finding container 66e52cd7208c95fcbd71fb1927add0955601cd0ef07bbfdc1ed028ec10aff186: Status 404 returned error can't find the container with id 
66e52cd7208c95fcbd71fb1927add0955601cd0ef07bbfdc1ed028ec10aff186 Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.224292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587836f8-b700-43d0-940e-81d7820b2a6b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.224621 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-srv-cert\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.224889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0589e639-75bf-4a26-a80b-dbb69a6c9955-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mrhs2\" (UID: \"0589e639-75bf-4a26-a80b-dbb69a6c9955\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.225726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.228651 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.231111 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.234162 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.246185 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.262328 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmx2\" (UniqueName: \"kubernetes.io/projected/5c603995-8326-4bea-892a-74ee1e8c8dea-kube-api-access-bzmx2\") pod \"console-operator-58897d9998-97wlq\" (UID: \"5c603995-8326-4bea-892a-74ee1e8c8dea\") " pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.279848 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89cb82f_a141_419f_bf33_93c219c84e51.slice/crio-b46aca90a25373a57866a7839140d6fd7b785d259391a93337547651dcda0236 WatchSource:0}: Error finding container b46aca90a25373a57866a7839140d6fd7b785d259391a93337547651dcda0236: Status 404 returned error can't find the container with id b46aca90a25373a57866a7839140d6fd7b785d259391a93337547651dcda0236 Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.280348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr9b7\" (UniqueName: 
\"kubernetes.io/projected/109e018a-9ad5-40e6-bd49-07d49d718161-kube-api-access-cr9b7\") pod \"dns-operator-744455d44c-45pjp\" (UID: \"109e018a-9ad5-40e6-bd49-07d49d718161\") " pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.281075 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.288153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.289976 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.295556 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mhf6\" (UniqueName: \"kubernetes.io/projected/7074cf98-12f4-4a73-ad96-4959f64398a7-kube-api-access-5mhf6\") pod \"downloads-7954f5f757-xzx7n\" (UID: \"7074cf98-12f4-4a73-ad96-4959f64398a7\") " pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.303922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.304354 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:14.804341096 +0000 UTC m=+218.262372627 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.310193 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.319325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2669j\" (UniqueName: \"kubernetes.io/projected/7ff4d197-bbb9-45f9-9cdc-0c76079aa9db-kube-api-access-2669j\") pod \"openshift-apiserver-operator-796bbdcf4f-jrpxl\" (UID: \"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.340991 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpnfm\" (UniqueName: \"kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm\") pod \"route-controller-manager-6576b87f9c-g9lr4\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.366058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-427xq\" (UniqueName: \"kubernetes.io/projected/4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc-kube-api-access-427xq\") pod \"authentication-operator-69f744f599-5l8ml\" (UID: 
\"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.383552 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.389650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" event={"ID":"bfa1fcc6-c9f8-4928-8a95-6c418323dd69","Type":"ContainerStarted","Data":"86af0d43185f7143b9a376dbde8fe9646e5d2b9422bd0f4998a07d587413a525"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.400982 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" event={"ID":"0415738e-f327-433a-9a28-0a991138e021","Type":"ContainerStarted","Data":"f1f5ee537c22d825625f686e76d46e48805d3ce0195d22ee7f088e9d51e0315f"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.405929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.406109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlnwq\" (UniqueName: \"kubernetes.io/projected/cd9b8a97-1b9d-4365-a985-a02d4078e3c2-kube-api-access-qlnwq\") pod \"control-plane-machine-set-operator-78cbb6b69f-knn77\" (UID: 
\"cd9b8a97-1b9d-4365-a985-a02d4078e3c2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.406496 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:14.906475582 +0000 UTC m=+218.364507123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.420849 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" event={"ID":"9f31b9ac-9447-4b20-ac60-7532edfa4600","Type":"ContainerStarted","Data":"bb67d7239ac81c201272fb1b9adeb1ecbf97d522b376528a2f9bfe7e33a8004b"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.422998 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xln6k\" (UniqueName: \"kubernetes.io/projected/9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb-kube-api-access-xln6k\") pod \"service-ca-operator-777779d784-c49vv\" (UID: \"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.428186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hn4r8" 
event={"ID":"662f8843-e25d-48ce-989d-9ea05937757d","Type":"ContainerStarted","Data":"c7f65d1274bb19079f9f79351a782d7495541a1ecdc8d88a866af54812721807"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.436875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" event={"ID":"5c21571e-5513-46e0-9eed-4ec64df8e445","Type":"ContainerStarted","Data":"df19658d353c1fec007be00d0c32bf3d267bac688d1e2b796e081a83b42f3010"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.441828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cdbk\" (UniqueName: \"kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk\") pod \"marketplace-operator-79b997595-clvzs\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.441834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" event={"ID":"ed2729b3-6b5a-4ae7-bad5-699c95dab85f","Type":"ContainerStarted","Data":"8dc54184aa3235d402105c8c10b4efe9d346fbed9538a68e181e2046d4944844"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.442284 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.443593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" event={"ID":"c89cb82f-a141-419f-bf33-93c219c84e51","Type":"ContainerStarted","Data":"b46aca90a25373a57866a7839140d6fd7b785d259391a93337547651dcda0236"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.443908 4795 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nww6d container/controller-manager namespace/openshift-controller-manager: Readiness 
probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.443932 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.444657 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" event={"ID":"e6f292b8-878f-418e-8c85-2f7818e9dba1","Type":"ContainerStarted","Data":"506917fa822eba8eeccd36c6e28b51af1e4e1e68c026c93680ba389b83e3ae85"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.482704 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbc5q\" (UniqueName: \"kubernetes.io/projected/ff31d5af-4eae-43e7-8512-c6f5d54501e1-kube-api-access-lbc5q\") pod \"migrator-59844c95c7-q9zkh\" (UID: \"ff31d5af-4eae-43e7-8512-c6f5d54501e1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.482785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7dc57139-c6ad-4639-a09f-d07f8da49f4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pckpf\" (UID: \"7dc57139-c6ad-4639-a09f-d07f8da49f4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.483165 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.483762 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.484994 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" event={"ID":"2dbe21c7-d209-4259-b51d-b486b741e9c7","Type":"ContainerStarted","Data":"66e52cd7208c95fcbd71fb1927add0955601cd0ef07bbfdc1ed028ec10aff186"} Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.504887 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bphg\" (UniqueName: \"kubernetes.io/projected/489757b2-0de4-4275-8931-daa5c3b4a75a-kube-api-access-2bphg\") pod \"csi-hostpathplugin-454wp\" (UID: \"489757b2-0de4-4275-8931-daa5c3b4a75a\") " pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.506738 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.507528 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.007232568 +0000 UTC m=+218.465264099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.507911 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.508360 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.008347329 +0000 UTC m=+218.466378870 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.510114 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.519774 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.520423 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.528959 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmgxw\" (UniqueName: \"kubernetes.io/projected/d2da6bf3-e17d-4adf-96dd-ea097cae192b-kube-api-access-tmgxw\") pod \"machine-config-controller-84d6567774-8jwc2\" (UID: \"d2da6bf3-e17d-4adf-96dd-ea097cae192b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.540395 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsz6b\" (UniqueName: \"kubernetes.io/projected/ae2bd81e-16ce-4b5c-8667-85c9426a9221-kube-api-access-gsz6b\") pod \"multus-admission-controller-857f4d67dd-gpx9r\" (UID: \"ae2bd81e-16ce-4b5c-8667-85c9426a9221\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.572784 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7c4k\" (UniqueName: \"kubernetes.io/projected/067aa008-8dda-4bfe-bfd2-388abdb54299-kube-api-access-h7c4k\") pod \"router-default-5444994796-lrxrs\" (UID: \"067aa008-8dda-4bfe-bfd2-388abdb54299\") " pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.591265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fj9p\" (UniqueName: 
\"kubernetes.io/projected/a61cce87-0b4f-4886-a347-b98aecad272a-kube-api-access-8fj9p\") pod \"package-server-manager-789f6589d5-qj44k\" (UID: \"a61cce87-0b4f-4886-a347-b98aecad272a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.598277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfz4k\" (UniqueName: \"kubernetes.io/projected/9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0-kube-api-access-jfz4k\") pod \"machine-config-server-8v58t\" (UID: \"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0\") " pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.609375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.612485 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.112457397 +0000 UTC m=+218.570489038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.614484 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mmtf7"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.622959 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-45pjp"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.623637 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.627489 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbs2f\" (UniqueName: \"kubernetes.io/projected/5611db8a-18df-426e-a6e7-7f6720da4109-kube-api-access-nbs2f\") pod \"packageserver-d55dfcdfc-c2p9f\" (UID: \"5611db8a-18df-426e-a6e7-7f6720da4109\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.638298 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.648033 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.659282 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9cd4\" (UniqueName: \"kubernetes.io/projected/fe2960a0-9218-4d46-8c50-7285c5e27882-kube-api-access-m9cd4\") pod \"machine-config-operator-74547568cd-kzt8c\" (UID: \"fe2960a0-9218-4d46-8c50-7285c5e27882\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.660446 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.662419 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2pxd\" (UniqueName: \"kubernetes.io/projected/428476fd-a8f2-4ffc-bda6-f19da80778ac-kube-api-access-c2pxd\") pod \"ingress-canary-gdx4t\" (UID: \"428476fd-a8f2-4ffc-bda6-f19da80778ac\") " pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.663447 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod109e018a_9ad5_40e6_bd49_07d49d718161.slice/crio-b708b1c701818d51fc3cb8fe9602af7235c07027a0302e2046f6aae6243a8521 WatchSource:0}: Error finding container b708b1c701818d51fc3cb8fe9602af7235c07027a0302e2046f6aae6243a8521: Status 404 returned error can't find the container with id b708b1c701818d51fc3cb8fe9602af7235c07027a0302e2046f6aae6243a8521 Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.687893 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.693928 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.701521 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg7w2\" (UniqueName: \"kubernetes.io/projected/d1a9c8a4-d7c9-4365-8516-465b89c76ea8-kube-api-access-mg7w2\") pod \"olm-operator-6b444d44fb-wwjgw\" (UID: \"d1a9c8a4-d7c9-4365-8516-465b89c76ea8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.701625 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2jpd\" (UniqueName: \"kubernetes.io/projected/b4698fe3-a607-4978-bad4-5b83d3beb21b-kube-api-access-k2jpd\") pod \"catalog-operator-68c6474976-nsf5t\" (UID: \"b4698fe3-a607-4978-bad4-5b83d3beb21b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.706253 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.710392 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cdrcc"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.711479 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.711897 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.211882215 +0000 UTC m=+218.669913756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.719660 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.728067 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.735263 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvqj4\" (UniqueName: \"kubernetes.io/projected/ee6682f1-2148-45a9-ac41-aeb6fddbabb4-kube-api-access-wvqj4\") pod \"dns-default-5v822\" (UID: \"ee6682f1-2148-45a9-ac41-aeb6fddbabb4\") " pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.740107 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.740334 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e80a44_9bdc_4321_9536_8eba4527f181.slice/crio-e896b711c3658be489a0bb06d1a95b5034bb3fc548185ed627dd3dd0684f441b WatchSource:0}: Error finding container e896b711c3658be489a0bb06d1a95b5034bb3fc548185ed627dd3dd0684f441b: Status 404 returned error can't find the container with id e896b711c3658be489a0bb06d1a95b5034bb3fc548185ed627dd3dd0684f441b Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.745661 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0068e5c-7377-479d-9cc5-fd1270c74b33-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vwmr9\" (UID: \"d0068e5c-7377-479d-9cc5-fd1270c74b33\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.749079 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.753677 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.763667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtbkf\" (UniqueName: \"kubernetes.io/projected/7479d10c-1c3b-497e-8dda-07cd22aeccf0-kube-api-access-qtbkf\") pod \"service-ca-9c57cc56f-kgsw2\" (UID: \"7479d10c-1c3b-497e-8dda-07cd22aeccf0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.777332 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jmdf\" (UniqueName: \"kubernetes.io/projected/661df377-ed57-4f75-9be9-3fc5f87cf37e-kube-api-access-4jmdf\") pod \"kube-storage-version-migrator-operator-b67b599dd-bq4hh\" (UID: \"661df377-ed57-4f75-9be9-3fc5f87cf37e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.778829 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.797341 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8v58t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.802143 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-454wp" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.809762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/587836f8-b700-43d0-940e-81d7820b2a6b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9r828\" (UID: \"587836f8-b700-43d0-940e-81d7820b2a6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.812223 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.812648 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.312560829 +0000 UTC m=+218.770592380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.815789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.816284 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.316269272 +0000 UTC m=+218.774300813 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.821005 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5v822" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.825721 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gdx4t" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.830037 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fwn5\" (UniqueName: \"kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5\") pod \"auto-csr-approver-29567120-j7789\" (UID: \"bed1d31b-b060-45c3-95bf-3b226a36efe1\") " pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.840376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzk8h\" (UniqueName: \"kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h\") pod \"collect-profiles-29567115-v58cr\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.878408 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.880449 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xzx7n"] Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.917557 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.917720 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.417696143 +0000 UTC m=+218.875727684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.917841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:14 crc kubenswrapper[4795]: E0320 17:21:14.918138 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.418126279 +0000 UTC m=+218.876157810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.931803 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.948260 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7074cf98_12f4_4a73_ad96_4959f64398a7.slice/crio-5815c7abe698eb110c49411c695f47ad0e46e3c7cdd3062d360030c37119db73 WatchSource:0}: Error finding container 5815c7abe698eb110c49411c695f47ad0e46e3c7cdd3062d360030c37119db73: Status 404 returned error can't find the container with id 5815c7abe698eb110c49411c695f47ad0e46e3c7cdd3062d360030c37119db73 Mar 20 17:21:14 crc kubenswrapper[4795]: I0320 17:21:14.953225 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:14 crc kubenswrapper[4795]: W0320 17:21:14.958143 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c2ae844_da0b_45fd_b9f6_e780f4c5e3c0.slice/crio-370dce15e6c84cc0a92e286c6f5fbcc24fb25aca471b6b19fac2c69c5d27bb55 WatchSource:0}: Error finding container 370dce15e6c84cc0a92e286c6f5fbcc24fb25aca471b6b19fac2c69c5d27bb55: Status 404 returned error can't find the container with id 370dce15e6c84cc0a92e286c6f5fbcc24fb25aca471b6b19fac2c69c5d27bb55 Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.010827 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.019836 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.020331 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.520298856 +0000 UTC m=+218.978330397 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.038951 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.055478 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.063850 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.073894 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.095137 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.121876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.122157 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.622146512 +0000 UTC m=+219.080178053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.138768 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5l8ml"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.161629 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.166981 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-97wlq"] Mar 20 17:21:15 crc 
kubenswrapper[4795]: I0320 17:21:15.222767 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.223145 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.723129457 +0000 UTC m=+219.181160998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.325442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.325849 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:15.825833434 +0000 UTC m=+219.283864975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.335877 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gpx9r"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.368962 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.426349 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.426607 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:15.92658372 +0000 UTC m=+219.384615261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.517378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" event={"ID":"74d8b767-93df-4c96-a7f0-e7e84ba99380","Type":"ContainerStarted","Data":"54c16e287e6b044067d81a5f122f5fce8bd8b850064a731beff318d152b5a0e9"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.532015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.534999 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.034983882 +0000 UTC m=+219.493015423 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.556300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" event={"ID":"109e018a-9ad5-40e6-bd49-07d49d718161","Type":"ContainerStarted","Data":"b708b1c701818d51fc3cb8fe9602af7235c07027a0302e2046f6aae6243a8521"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.559313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" event={"ID":"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc","Type":"ContainerStarted","Data":"18af66a2ad92bf329fba2a2f1838a1cccadcb24d62e75458f9f37c1f2a373e33"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.581490 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" event={"ID":"c89cb82f-a141-419f-bf33-93c219c84e51","Type":"ContainerStarted","Data":"180c6dc025013638cc807fdf1c845f8d6cda91b8b6f98a2afe0514c07d4b9530"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.581534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" event={"ID":"c89cb82f-a141-419f-bf33-93c219c84e51","Type":"ContainerStarted","Data":"8579508d0e7b50d30b4b5e7989c16cb3aeefdcdf0d60a69b20c5745b4c3dc9da"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.583886 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" event={"ID":"9b31867d-2f52-4f4c-943a-9431cb585027","Type":"ContainerStarted","Data":"f1055e0f897eb3de575665d60219e71c19392e8e5d91a217f1cf463d9d4c9c60"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.586030 4795 generic.go:334] "Generic (PLEG): container finished" podID="5c21571e-5513-46e0-9eed-4ec64df8e445" containerID="c9ed998f7b281a48e48683b77903f0a206a9d4a28e8a2a53e079b69fdbc3b983" exitCode=0 Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.586092 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" event={"ID":"5c21571e-5513-46e0-9eed-4ec64df8e445","Type":"ContainerDied","Data":"c9ed998f7b281a48e48683b77903f0a206a9d4a28e8a2a53e079b69fdbc3b983"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.587270 4795 generic.go:334] "Generic (PLEG): container finished" podID="0415738e-f327-433a-9a28-0a991138e021" containerID="f68eb51d5c9397851fe5e4cd3a21388a07f0fc3771864a526fae4e81b30aca40" exitCode=0 Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.587323 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" event={"ID":"0415738e-f327-433a-9a28-0a991138e021","Type":"ContainerDied","Data":"f68eb51d5c9397851fe5e4cd3a21388a07f0fc3771864a526fae4e81b30aca40"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.588180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" event={"ID":"e6f292b8-878f-418e-8c85-2f7818e9dba1","Type":"ContainerStarted","Data":"7e929028c9976aa944a24cd775ab21d67ac03300461edb7f6203898cf7a1e075"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.592975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" 
event={"ID":"ed2729b3-6b5a-4ae7-bad5-699c95dab85f","Type":"ContainerStarted","Data":"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.593717 4795 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nww6d container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.593761 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.593974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" event={"ID":"e5e80a44-9bdc-4321-9536-8eba4527f181","Type":"ContainerStarted","Data":"e896b711c3658be489a0bb06d1a95b5034bb3fc548185ed627dd3dd0684f441b"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.595479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lrxrs" event={"ID":"067aa008-8dda-4bfe-bfd2-388abdb54299","Type":"ContainerStarted","Data":"4e4517e84f9b723ec8f24685f10c6bbee574b560c64eb3923fca706613086039"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.598252 4795 generic.go:334] "Generic (PLEG): container finished" podID="2dbe21c7-d209-4259-b51d-b486b741e9c7" containerID="1733cd22fbd27185865760c89a7ab5cc50cefba540a0433cda64c9f1148b7ed5" exitCode=0 Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.598320 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" event={"ID":"2dbe21c7-d209-4259-b51d-b486b741e9c7","Type":"ContainerDied","Data":"1733cd22fbd27185865760c89a7ab5cc50cefba540a0433cda64c9f1148b7ed5"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.601839 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" event={"ID":"9f31b9ac-9447-4b20-ac60-7532edfa4600","Type":"ContainerStarted","Data":"2048fcd6da728d0a278446bafbcdb77ecde924f134d2b1a44b72a125b1721e3c"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.601890 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" event={"ID":"9f31b9ac-9447-4b20-ac60-7532edfa4600","Type":"ContainerStarted","Data":"99d36198ddf70b000627cdbdb035489a54a189c71817814eeed4f8b69dc14386"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.608655 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" event={"ID":"bfa1fcc6-c9f8-4928-8a95-6c418323dd69","Type":"ContainerStarted","Data":"fa028b0d15db3d54fb873ef78829dfec8396a21c64139d721d4aa98c65b9c0a6"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.608711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" event={"ID":"bfa1fcc6-c9f8-4928-8a95-6c418323dd69","Type":"ContainerStarted","Data":"91f71ce7f513a7937de856fd70108bad56de6a942f3aad8f88712b5ae1b6e03d"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.617523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hn4r8" event={"ID":"662f8843-e25d-48ce-989d-9ea05937757d","Type":"ContainerStarted","Data":"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.618560 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xzx7n" event={"ID":"7074cf98-12f4-4a73-ad96-4959f64398a7","Type":"ContainerStarted","Data":"5815c7abe698eb110c49411c695f47ad0e46e3c7cdd3062d360030c37119db73"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.625628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" event={"ID":"0589e639-75bf-4a26-a80b-dbb69a6c9955","Type":"ContainerStarted","Data":"76a82a643089fa9d447aac978b5c6e67418fb24ac358cc2b5b2b550cdda222c3"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.627392 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-97wlq" event={"ID":"5c603995-8326-4bea-892a-74ee1e8c8dea","Type":"ContainerStarted","Data":"6c46a4db9827b197f840f3213bab1ae85a40953d78cbd3db3e409d8746c0b8fa"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.628041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" event={"ID":"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db","Type":"ContainerStarted","Data":"e223dc095c0838fe7665fd014cc949fa9ca5d604adfd1ffc0cadf593e6b69e7f"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.628555 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" event={"ID":"32dc8fa2-0199-444e-9983-4af0fb9172b1","Type":"ContainerStarted","Data":"f781d3f94d993cd68303fcd18451ac14ed765b0b116e4ee7ddb6556d28d030fa"} Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.629054 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8v58t" event={"ID":"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0","Type":"ContainerStarted","Data":"370dce15e6c84cc0a92e286c6f5fbcc24fb25aca471b6b19fac2c69c5d27bb55"} Mar 20 17:21:15 crc kubenswrapper[4795]: 
I0320 17:21:15.637228 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.637995 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.137974758 +0000 UTC m=+219.596006289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.744048 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.754591 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:16.254562516 +0000 UTC m=+219.712594057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.846631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.847279 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.347238211 +0000 UTC m=+219.805269762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.847938 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.848381 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.348369402 +0000 UTC m=+219.806400943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.903873 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.923685 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77"] Mar 20 17:21:15 crc kubenswrapper[4795]: I0320 17:21:15.952269 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:15 crc kubenswrapper[4795]: E0320 17:21:15.952802 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.45277047 +0000 UTC m=+219.910802011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.054368 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.054765 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.554752471 +0000 UTC m=+220.012784012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: W0320 17:21:16.090895 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd9b8a97_1b9d_4365_a985_a02d4078e3c2.slice/crio-e5b30a980a1ea8f000e0c5e966caf03a49d033f0c2a6d4a575291d2d8a645681 WatchSource:0}: Error finding container e5b30a980a1ea8f000e0c5e966caf03a49d033f0c2a6d4a575291d2d8a645681: Status 404 returned error can't find the container with id e5b30a980a1ea8f000e0c5e966caf03a49d033f0c2a6d4a575291d2d8a645681 Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.154860 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.155363 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.655349832 +0000 UTC m=+220.113381373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.224256 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" podStartSLOduration=163.224234541 podStartE2EDuration="2m43.224234541s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.216352077 +0000 UTC m=+219.674383618" watchObservedRunningTime="2026-03-20 17:21:16.224234541 +0000 UTC m=+219.682266082" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.256779 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.257136 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.757116614 +0000 UTC m=+220.215148155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.264194 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gdx4t"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.286570 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.294647 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.314049 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-454wp"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.316772 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5v822"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.358211 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.358582 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.858566456 +0000 UTC m=+220.316597997 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.407773 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.437316 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.483985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.484340 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:16.984315382 +0000 UTC m=+220.442346923 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.528304 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.529661 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.538140 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.538525 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4rrsb" podStartSLOduration=163.538509733 podStartE2EDuration="2m43.538509733s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.471770191 +0000 UTC m=+219.929801742" watchObservedRunningTime="2026-03-20 17:21:16.538509733 +0000 UTC m=+219.996541274" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.543812 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z5ls6" podStartSLOduration=163.543794123 podStartE2EDuration="2m43.543794123s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.526815012 +0000 UTC m=+219.984846553" watchObservedRunningTime="2026-03-20 17:21:16.543794123 +0000 UTC m=+220.001825664" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.556863 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.559328 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c49vv"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.586510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.586815 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.08680075 +0000 UTC m=+220.544832291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.594422 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.597755 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-j7789"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.601350 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kgsw2"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.606638 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.632614 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828"] Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.656075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lrxrs" event={"ID":"067aa008-8dda-4bfe-bfd2-388abdb54299","Type":"ContainerStarted","Data":"5074947363c8e078a6e0c94b853247cc510b7024fc9ef716b4a077bf949eef30"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.679039 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hn4r8" podStartSLOduration=163.679025601 podStartE2EDuration="2m43.679025601s" 
podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.678355677 +0000 UTC m=+220.136387228" watchObservedRunningTime="2026-03-20 17:21:16.679025601 +0000 UTC m=+220.137057142" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.693849 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.694168 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.194157915 +0000 UTC m=+220.652189456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.713379 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-p5hmr" podStartSLOduration=163.713365666 podStartE2EDuration="2m43.713365666s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.712258587 +0000 UTC m=+220.170290138" watchObservedRunningTime="2026-03-20 17:21:16.713365666 +0000 UTC m=+220.171397207" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.743904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gdx4t" event={"ID":"428476fd-a8f2-4ffc-bda6-f19da80778ac","Type":"ContainerStarted","Data":"a2ec43531ee102b81c16e77f9e33cd1c8bfc557b2e21e312edb7e2216cbd8413"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.774966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" event={"ID":"d2da6bf3-e17d-4adf-96dd-ea097cae192b","Type":"ContainerStarted","Data":"85d0bfe7d4eb237c6bde932b0f7cbca57c7cee54530fd3158bc9526fe9da9bfd"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.775127 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.795759 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.797328 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.297311868 +0000 UTC m=+220.755343409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.802216 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gplds" podStartSLOduration=163.802196874 podStartE2EDuration="2m43.802196874s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.797189363 +0000 UTC m=+220.255220904" watchObservedRunningTime="2026-03-20 17:21:16.802196874 +0000 UTC m=+220.260228415" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.830577 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" 
event={"ID":"0415738e-f327-433a-9a28-0a991138e021","Type":"ContainerStarted","Data":"b4ae167299192c5b7ea1edc08b3d4054f88f12c7bc58610329352460a168f0c8"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.849869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" event={"ID":"b4698fe3-a607-4978-bad4-5b83d3beb21b","Type":"ContainerStarted","Data":"844da625d6cd3258eb757238232b8cab634b2d2b87eb5b6a81bc94c3ddac17f8"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.873340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5v822" event={"ID":"ee6682f1-2148-45a9-ac41-aeb6fddbabb4","Type":"ContainerStarted","Data":"31266e8030183d9f823c01a5d24f31a879c15602d69fc31d02394e66ea006052"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.899396 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:16 crc kubenswrapper[4795]: E0320 17:21:16.901463 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.401447716 +0000 UTC m=+220.859479257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.910803 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" event={"ID":"74d8b767-93df-4c96-a7f0-e7e84ba99380","Type":"ContainerStarted","Data":"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.913537 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.912966 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-lrxrs" podStartSLOduration=163.91295042 podStartE2EDuration="2m43.91295042s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.910888076 +0000 UTC m=+220.368919617" watchObservedRunningTime="2026-03-20 17:21:16.91295042 +0000 UTC m=+220.370981961" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.916801 4795 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-mmtf7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused" start-of-body= Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.916856 4795 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.940836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" event={"ID":"918aa57e-8c94-4427-b6bd-218a5687d684","Type":"ContainerStarted","Data":"552c19deec1ee9883f89b895bd5a9ae748bbcb5e7537b45d9e966f6c5f189edb"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.944545 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" podStartSLOduration=163.944526097 podStartE2EDuration="2m43.944526097s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.937981531 +0000 UTC m=+220.396013072" watchObservedRunningTime="2026-03-20 17:21:16.944526097 +0000 UTC m=+220.402557638" Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.975151 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" event={"ID":"5c21571e-5513-46e0-9eed-4ec64df8e445","Type":"ContainerStarted","Data":"28f31a930970e41b0cef9f1365f8642907101991a92f9c69e9321574b4fa02fb"} Mar 20 17:21:16 crc kubenswrapper[4795]: I0320 17:21:16.985236 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"] Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.000967 4795 ???:1] "http: TLS handshake error from 192.168.126.11:45976: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.035389 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.037659 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" podStartSLOduration=164.037644128 podStartE2EDuration="2m44.037644128s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:16.991232378 +0000 UTC m=+220.449263909" watchObservedRunningTime="2026-03-20 17:21:17.037644128 +0000 UTC m=+220.495675669" Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.043994 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.543970706 +0000 UTC m=+221.002002237 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.048366 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" event={"ID":"5611db8a-18df-426e-a6e7-7f6720da4109","Type":"ContainerStarted","Data":"d1be7ae7a18f045a6ad585588e79f66d1659499c01a7a3968ddeb8104aa71b6c"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.048443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" event={"ID":"5611db8a-18df-426e-a6e7-7f6720da4109","Type":"ContainerStarted","Data":"3c6528db53dd00c4095acc6b155873dcd4c99c1f85eab4c0a64d5c27b2f82a85"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.049885 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.066545 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"] Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.066939 4795 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-c2p9f container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.068420 4795 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" podUID="5611db8a-18df-426e-a6e7-7f6720da4109" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.077223 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" event={"ID":"ae2bd81e-16ce-4b5c-8667-85c9426a9221","Type":"ContainerStarted","Data":"656623000fd2b6c8f5c9b771fe74413e198fc2c20adee4fb3f7a29a06a8553ad"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.077291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" event={"ID":"ae2bd81e-16ce-4b5c-8667-85c9426a9221","Type":"ContainerStarted","Data":"0bcf03237f8b723d2e143b3ed0a5ebe114f436dae5359163a75c364d0874c5ca"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.093666 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" podStartSLOduration=164.088873532 podStartE2EDuration="2m44.088873532s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.087353798 +0000 UTC m=+220.545385339" watchObservedRunningTime="2026-03-20 17:21:17.088873532 +0000 UTC m=+220.546905073" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.116082 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" event={"ID":"a61cce87-0b4f-4886-a347-b98aecad272a","Type":"ContainerStarted","Data":"4aeaab9db3e68a9fcefb7e827f62ac75ed7dc12dd4c14a5a1d3770bea71ec19f"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.128017 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" event={"ID":"ff31d5af-4eae-43e7-8512-c6f5d54501e1","Type":"ContainerStarted","Data":"cd8eb396cff6031b3c84ff18d9f8b15e72fe7f8c789adb6b655a9f2e39111bce"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.137595 4795 ???:1] "http: TLS handshake error from 192.168.126.11:45990: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.137716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.138117 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerStarted","Data":"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.138265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerStarted","Data":"10ac9aefe8ac1466c7fac8993e74ddbafb9c6821332b48f3d05657ff9290f6e5"} Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.138391 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.638376124 +0000 UTC m=+221.096407735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.139068 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.140607 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-clvzs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.140746 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.143784 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-97wlq" event={"ID":"5c603995-8326-4bea-892a-74ee1e8c8dea","Type":"ContainerStarted","Data":"8ca2b6dc3b7309e97e3343c108c271295289b09bf2fdad4cda5a15905c7d3f5c"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.143833 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 
17:21:17.147972 4795 patch_prober.go:28] interesting pod/console-operator-58897d9998-97wlq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.148029 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-97wlq" podUID="5c603995-8326-4bea-892a-74ee1e8c8dea" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.149216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-454wp" event={"ID":"489757b2-0de4-4275-8931-daa5c3b4a75a","Type":"ContainerStarted","Data":"43dabd2f36a64796afc5bb69ac01cf23091d9567a626a9d8c7bcbcbd63e40a77"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.156546 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" event={"ID":"e5e80a44-9bdc-4321-9536-8eba4527f181","Type":"ContainerStarted","Data":"4f4163c43ad7e99e4fc450a246670ca3e89f4a7cd4e150b2fb7c320ffc818ac6"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.158933 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" event={"ID":"32dc8fa2-0199-444e-9983-4af0fb9172b1","Type":"ContainerStarted","Data":"ed3d9634104e48b3e997a511558964c428b3525c8eb6eea89c978e4afba0014b"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.158974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" 
event={"ID":"32dc8fa2-0199-444e-9983-4af0fb9172b1","Type":"ContainerStarted","Data":"2302ec3f03a6f6e4554b4902e4e20d7ff92560402c2630351bfbe65e589c53c5"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.161941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" event={"ID":"109e018a-9ad5-40e6-bd49-07d49d718161","Type":"ContainerStarted","Data":"5eb1ff3d6226ec2921a970f4e455445453aaf29559587a532f5a5839e6bf53b6"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.161998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" event={"ID":"109e018a-9ad5-40e6-bd49-07d49d718161","Type":"ContainerStarted","Data":"69cff67a961c3b591feac8bcfe68ebb66c1041603648fbc42cf4c026019a25c9"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.168136 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" podStartSLOduration=164.168119065 podStartE2EDuration="2m44.168119065s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.167099568 +0000 UTC m=+220.625131109" watchObservedRunningTime="2026-03-20 17:21:17.168119065 +0000 UTC m=+220.626150626" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.194263 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g5zjc" podStartSLOduration=164.194244926 podStartE2EDuration="2m44.194244926s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.193310492 +0000 UTC m=+220.651342033" watchObservedRunningTime="2026-03-20 17:21:17.194244926 +0000 UTC 
m=+220.652276467" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.195701 4795 ???:1] "http: TLS handshake error from 192.168.126.11:45998: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.222138 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" event={"ID":"9b31867d-2f52-4f4c-943a-9431cb585027","Type":"ContainerStarted","Data":"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.222871 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.231233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xzx7n" event={"ID":"7074cf98-12f4-4a73-ad96-4959f64398a7","Type":"ContainerStarted","Data":"ac28fbf851090a3f20657a132f33c64066426cdda1281462f4b02b532539f53a"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.231784 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.238045 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-xzx7n container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.238089 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xzx7n" podUID="7074cf98-12f4-4a73-ad96-4959f64398a7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 17:21:17 crc 
kubenswrapper[4795]: I0320 17:21:17.238602 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.239458 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.739444762 +0000 UTC m=+221.197476293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.252574 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" event={"ID":"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb","Type":"ContainerStarted","Data":"078e38f1e68993a94da4fa6731635b71bf717db58878df9dddaaff4f76a78a28"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.263719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" event={"ID":"fe2960a0-9218-4d46-8c50-7285c5e27882","Type":"ContainerStarted","Data":"0b9b62e8bcb48fda415edd92c26eb068bd4c60e58a12ac8eaf98d8be4020ad2a"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.272602 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" event={"ID":"7ff4d197-bbb9-45f9-9cdc-0c76079aa9db","Type":"ContainerStarted","Data":"2379257154e0054e98ee3a70fccefb094c64b7450dc6992a4ce9411ab4b33bae"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.277647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" event={"ID":"cd9b8a97-1b9d-4365-a985-a02d4078e3c2","Type":"ContainerStarted","Data":"88bf7f225a65d9066735c363a6af63172e793eb14b750c94f9cbb6bebc8f4cf6"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.277720 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" event={"ID":"cd9b8a97-1b9d-4365-a985-a02d4078e3c2","Type":"ContainerStarted","Data":"e5b30a980a1ea8f000e0c5e966caf03a49d033f0c2a6d4a575291d2d8a645681"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.293154 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" event={"ID":"2dbe21c7-d209-4259-b51d-b486b741e9c7","Type":"ContainerStarted","Data":"736fcacac3c55fdf35c00fdcfd8aef8acd88fcbbf9c2da52162e023dbf09f8b5"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.293312 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.302094 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46004: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.304675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" 
event={"ID":"4ad2a1f4-aa7e-4f97-8ce6-0ae36dd5c9cc","Type":"ContainerStarted","Data":"a9db3ae2c35b309dccb2b4aad094b93af423521c890af0d8da9bafb6fc6d1a48"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.312588 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-cdrcc" podStartSLOduration=164.312574944 podStartE2EDuration="2m44.312574944s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.253006921 +0000 UTC m=+220.711038452" watchObservedRunningTime="2026-03-20 17:21:17.312574944 +0000 UTC m=+220.770606485" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.313759 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-45pjp" podStartSLOduration=164.313752937 podStartE2EDuration="2m44.313752937s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.312102927 +0000 UTC m=+220.770134468" watchObservedRunningTime="2026-03-20 17:21:17.313752937 +0000 UTC m=+220.771784478" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.329160 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" event={"ID":"0589e639-75bf-4a26-a80b-dbb69a6c9955","Type":"ContainerStarted","Data":"16bdca6831351fe8fda25820cdee23586fb9c32f72a6772ec3dae34fdfc35e77"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.339914 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-97wlq" podStartSLOduration=164.339897818 podStartE2EDuration="2m44.339897818s" podCreationTimestamp="2026-03-20 17:18:33 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.339565136 +0000 UTC m=+220.797596677" watchObservedRunningTime="2026-03-20 17:21:17.339897818 +0000 UTC m=+220.797929369" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.340363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.343095 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.843084502 +0000 UTC m=+221.301116043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.351770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8v58t" event={"ID":"9c2ae844-da0b-45fd-b9f6-e780f4c5e3c0","Type":"ContainerStarted","Data":"3f7fe5bf0227e70a4d4d0a8926c977f8e860e099cd32663a603e50c22d058872"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.363611 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" event={"ID":"661df377-ed57-4f75-9be9-3fc5f87cf37e","Type":"ContainerStarted","Data":"137f21a34253c69141125a51a6740fb94fed150263dd468485ae82a0c92d589e"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.368546 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" event={"ID":"7dc57139-c6ad-4639-a09f-d07f8da49f4e","Type":"ContainerStarted","Data":"e5354570ff761cd4a55de7c9761b88e79e43092472e6aa3a0f82a5da49a6dcb2"} Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.381451 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.412076 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46008: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.412882 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrpxl" podStartSLOduration=164.412864574 podStartE2EDuration="2m44.412864574s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.412261283 +0000 UTC m=+220.870292824" watchObservedRunningTime="2026-03-20 17:21:17.412864574 +0000 UTC m=+220.870896115" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.414821 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-5l8ml" podStartSLOduration=164.414814705 podStartE2EDuration="2m44.414814705s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.383608161 +0000 UTC m=+220.841639702" watchObservedRunningTime="2026-03-20 17:21:17.414814705 +0000 UTC m=+220.872846246" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.445133 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.446293 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:17.946274396 +0000 UTC m=+221.404305937 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.450466 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mrhs2" podStartSLOduration=164.450452917 podStartE2EDuration="2m44.450452917s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.444541954 +0000 UTC m=+220.902573495" watchObservedRunningTime="2026-03-20 17:21:17.450452917 +0000 UTC m=+220.908484458" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.474958 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" podStartSLOduration=164.474944139 podStartE2EDuration="2m44.474944139s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.474594326 +0000 UTC m=+220.932625867" watchObservedRunningTime="2026-03-20 17:21:17.474944139 +0000 UTC m=+220.932975680" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.516734 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46012: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.526939 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/downloads-7954f5f757-xzx7n" podStartSLOduration=164.52692658 podStartE2EDuration="2m44.52692658s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.499205082 +0000 UTC m=+220.957236623" watchObservedRunningTime="2026-03-20 17:21:17.52692658 +0000 UTC m=+220.984958121" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.547245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.549267 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.049247524 +0000 UTC m=+221.507279065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.569957 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" podStartSLOduration=164.569940448 podStartE2EDuration="2m44.569940448s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.541923819 +0000 UTC m=+220.999955360" watchObservedRunningTime="2026-03-20 17:21:17.569940448 +0000 UTC m=+221.027971989" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.625603 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-knn77" podStartSLOduration=164.625581981 podStartE2EDuration="2m44.625581981s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.566282077 +0000 UTC m=+221.024313618" watchObservedRunningTime="2026-03-20 17:21:17.625581981 +0000 UTC m=+221.083613522" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.626940 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46022: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.627932 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.639006 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.649809 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.650819 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.150799198 +0000 UTC m=+221.608830729 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.658811 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8v58t" podStartSLOduration=6.658792077 podStartE2EDuration="6.658792077s" podCreationTimestamp="2026-03-20 17:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:17.627235801 +0000 UTC m=+221.085267342" watchObservedRunningTime="2026-03-20 17:21:17.658792077 +0000 UTC m=+221.116823618" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.659164 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:17 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:17 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:17 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.659207 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.755191 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.755671 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.255639172 +0000 UTC m=+221.713670713 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.795673 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46028: no serving certificate available for the kubelet" Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.858566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.858950 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-20 17:21:18.3589377 +0000 UTC m=+221.816969241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:17 crc kubenswrapper[4795]: I0320 17:21:17.960470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:17 crc kubenswrapper[4795]: E0320 17:21:17.960873 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.460859859 +0000 UTC m=+221.918891400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.061217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.061846 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.561829864 +0000 UTC m=+222.019861405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.171360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.171744 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.671727739 +0000 UTC m=+222.129759280 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.272146 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.272450 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.772435853 +0000 UTC m=+222.230467394 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.376215 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.376501 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.876487169 +0000 UTC m=+222.334518710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.396764 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5v822" event={"ID":"ee6682f1-2148-45a9-ac41-aeb6fddbabb4","Type":"ContainerStarted","Data":"8ec532c8079cd62d4c26a9733e32c1b12deb61ee64035e6e8fa18edbb30cca2f"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.474209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" event={"ID":"5c21571e-5513-46e0-9eed-4ec64df8e445","Type":"ContainerStarted","Data":"224d79e5dc4ec71d76223795bb5cbedf4513345b2722860bb4471a4dc08f027f"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.477210 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.477508 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:18.977490094 +0000 UTC m=+222.435521635 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.488530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" event={"ID":"9fb6b65a-69af-42cc-a4f7-f3e2ad4b98cb","Type":"ContainerStarted","Data":"12b36845a5a7d228e3e748fe6a0fda2bb004ced1ad6fb38879b59fe5fb0b0c33"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.505878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" event={"ID":"fe2960a0-9218-4d46-8c50-7285c5e27882","Type":"ContainerStarted","Data":"7147945169d288d4147169b27adce639e5218dbe624dadf2e19de8e5cd1f7b88"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.512492 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gdx4t" event={"ID":"428476fd-a8f2-4ffc-bda6-f19da80778ac","Type":"ContainerStarted","Data":"bb7f33248b0e1fb6f93959335725dcf46d2ff4ab075cdb3f56b0e0fe7a1bf85a"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.521527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" event={"ID":"ff31d5af-4eae-43e7-8512-c6f5d54501e1","Type":"ContainerStarted","Data":"05423c9183d5036ef1a1c4ae8a8e678057166b475a68c1fbe1b7635f827ee892"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.528591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" 
event={"ID":"918aa57e-8c94-4427-b6bd-218a5687d684","Type":"ContainerStarted","Data":"1a29e74f6dc8f40ef08045f483f837253c35a577aa6f85ce5cd8c2a56afebf9c"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.537440 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46030: no serving certificate available for the kubelet" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.546666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" event={"ID":"b4698fe3-a607-4978-bad4-5b83d3beb21b","Type":"ContainerStarted","Data":"1e232a99c6bb86fa55b55610d3a0d5aea787c9dd57ea0519637c851d1b2fa337"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.546982 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.548045 4795 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-nsf5t container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.548081 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" podUID="b4698fe3-a607-4978-bad4-5b83d3beb21b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.569859 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" 
event={"ID":"d2da6bf3-e17d-4adf-96dd-ea097cae192b","Type":"ContainerStarted","Data":"94fe864c74af53577f10e39d8b02a6f9d851e4064fdc194056dfbdf5c473e26b"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.570575 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" podStartSLOduration=165.570532564 podStartE2EDuration="2m45.570532564s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.537510705 +0000 UTC m=+221.995542246" watchObservedRunningTime="2026-03-20 17:21:18.570532564 +0000 UTC m=+222.028564105" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.573351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" event={"ID":"d0068e5c-7377-479d-9cc5-fd1270c74b33","Type":"ContainerStarted","Data":"a9ba1ae8884b90fea3faf4a17e7dc6dc873d005658b2a2b343bca6dd728d748c"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.578570 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.579965 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.079951133 +0000 UTC m=+222.537982674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.580676 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" event={"ID":"a61cce87-0b4f-4886-a347-b98aecad272a","Type":"ContainerStarted","Data":"47ba2a6c1eab62bab6e402e6b9a48c862df366bee01594b1a8522c5fb95bbd91"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.580727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" event={"ID":"a61cce87-0b4f-4886-a347-b98aecad272a","Type":"ContainerStarted","Data":"ef1bd3cd1a6f7f35a53192afe65bc066ef3ea860125dcb3c070783cdd45e1106"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.581135 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.590405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" event={"ID":"d1a9c8a4-d7c9-4365-8516-465b89c76ea8","Type":"ContainerStarted","Data":"064ccc2cb0d574f51c8f36bda50c4b3467372412b341b19c96ca0545c2cb6b83"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.590721 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.596840 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" event={"ID":"587836f8-b700-43d0-940e-81d7820b2a6b","Type":"ContainerStarted","Data":"d9ccefe373a7c66ab0fc517c77194971cbd3f11fab9855fec9dece4ab1ffb741"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.598943 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" event={"ID":"ae2bd81e-16ce-4b5c-8667-85c9426a9221","Type":"ContainerStarted","Data":"6bfcd69fde40a5a7065de6480195d1ea02ec281fd4ccf0bf301f7a53857fc995"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.605305 4795 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-wwjgw container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.605342 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" podUID="d1a9c8a4-d7c9-4365-8516-465b89c76ea8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.608798 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c49vv" podStartSLOduration=165.60878261 podStartE2EDuration="2m45.60878261s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.608207639 +0000 UTC m=+222.066239180" watchObservedRunningTime="2026-03-20 17:21:18.60878261 +0000 UTC m=+222.066814141" Mar 20 17:21:18 
crc kubenswrapper[4795]: I0320 17:21:18.609612 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" podStartSLOduration=165.60960625 podStartE2EDuration="2m45.60960625s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.587580037 +0000 UTC m=+222.045611578" watchObservedRunningTime="2026-03-20 17:21:18.60960625 +0000 UTC m=+222.067637791" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.620529 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" event={"ID":"7479d10c-1c3b-497e-8dda-07cd22aeccf0","Type":"ContainerStarted","Data":"8b6642f8b3cb955eddfd4a9ff9e184d0f2736818200c0c1b8f9d9e4799c51d01"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.625666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567120-j7789" event={"ID":"bed1d31b-b060-45c3-95bf-3b226a36efe1","Type":"ContainerStarted","Data":"dd9d5f9731ec60032210cdc180eb41d5f236e29f6e6729daa332365615c09023"} Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.625826 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" containerID="cri-o://2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58" gracePeriod=30 Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.627238 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-clvzs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 20 17:21:18 crc 
kubenswrapper[4795]: I0320 17:21:18.627269 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.628361 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gdx4t" podStartSLOduration=7.628341644 podStartE2EDuration="7.628341644s" podCreationTimestamp="2026-03-20 17:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.627080829 +0000 UTC m=+222.085112370" watchObservedRunningTime="2026-03-20 17:21:18.628341644 +0000 UTC m=+222.086373185" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.629260 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" podUID="9b31867d-2f52-4f4c-943a-9431cb585027" containerName="route-controller-manager" containerID="cri-o://113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de" gracePeriod=30 Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.632352 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-xzx7n container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.632406 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xzx7n" podUID="7074cf98-12f4-4a73-ad96-4959f64398a7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": 
dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.655850 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-97wlq" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.657026 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.659994 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:18 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:18 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:18 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.660047 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.682735 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" podStartSLOduration=165.682718812 podStartE2EDuration="2m45.682718812s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.68185816 +0000 UTC m=+222.139889701" watchObservedRunningTime="2026-03-20 17:21:18.682718812 +0000 UTC m=+222.140750353" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.683980 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.685594 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.185566034 +0000 UTC m=+222.643597575 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.741981 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.742506 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.744904 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" podStartSLOduration=165.74488996 podStartE2EDuration="2m45.74488996s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.700623086 +0000 UTC 
m=+222.158654627" watchObservedRunningTime="2026-03-20 17:21:18.74488996 +0000 UTC m=+222.202921501" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.745103 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" podStartSLOduration=165.745099307 podStartE2EDuration="2m45.745099307s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.742179392 +0000 UTC m=+222.200210933" watchObservedRunningTime="2026-03-20 17:21:18.745099307 +0000 UTC m=+222.203130848" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.774475 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" podStartSLOduration=165.774451183 podStartE2EDuration="2m45.774451183s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.773637784 +0000 UTC m=+222.231669325" watchObservedRunningTime="2026-03-20 17:21:18.774451183 +0000 UTC m=+222.232482724" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.786942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.802326 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" 
podStartSLOduration=165.802304026 podStartE2EDuration="2m45.802304026s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.791415904 +0000 UTC m=+222.249447445" watchObservedRunningTime="2026-03-20 17:21:18.802304026 +0000 UTC m=+222.260335567" Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.818731 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.318716076 +0000 UTC m=+222.776747617 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.821144 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gpx9r" podStartSLOduration=165.821122153 podStartE2EDuration="2m45.821122153s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.817474372 +0000 UTC m=+222.275505913" watchObservedRunningTime="2026-03-20 17:21:18.821122153 +0000 UTC m=+222.279153694" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.891191 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.891516 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.391500926 +0000 UTC m=+222.849532467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.899112 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" podStartSLOduration=165.89909776 podStartE2EDuration="2m45.89909776s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:18.864253796 +0000 UTC m=+222.322285337" watchObservedRunningTime="2026-03-20 17:21:18.89909776 +0000 UTC m=+222.357129301" Mar 20 17:21:18 crc kubenswrapper[4795]: I0320 17:21:18.996753 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:18 crc kubenswrapper[4795]: E0320 17:21:18.997297 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.497286484 +0000 UTC m=+222.955318025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.013814 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.013864 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.035848 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.100057 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 
17:21:19.100370 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.600354904 +0000 UTC m=+223.058386445 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.203262 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.203609 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.70359823 +0000 UTC m=+223.161629771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.217774 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.270577 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"] Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.270843 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b31867d-2f52-4f4c-943a-9431cb585027" containerName="route-controller-manager" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.270857 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b31867d-2f52-4f4c-943a-9431cb585027" containerName="route-controller-manager" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.270959 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b31867d-2f52-4f4c-943a-9431cb585027" containerName="route-controller-manager" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.271330 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.296735 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"] Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.305298 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.305362 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config\") pod \"9b31867d-2f52-4f4c-943a-9431cb585027\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.305426 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert\") pod \"9b31867d-2f52-4f4c-943a-9431cb585027\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.305461 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpnfm\" (UniqueName: \"kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm\") pod \"9b31867d-2f52-4f4c-943a-9431cb585027\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.305498 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca\") pod \"9b31867d-2f52-4f4c-943a-9431cb585027\" (UID: \"9b31867d-2f52-4f4c-943a-9431cb585027\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.307342 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca" (OuterVolumeSpecName: "client-ca") pod "9b31867d-2f52-4f4c-943a-9431cb585027" (UID: "9b31867d-2f52-4f4c-943a-9431cb585027"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.307424 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config" (OuterVolumeSpecName: "config") pod "9b31867d-2f52-4f4c-943a-9431cb585027" (UID: "9b31867d-2f52-4f4c-943a-9431cb585027"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.307519 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.8075036 +0000 UTC m=+223.265535141 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.317367 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9b31867d-2f52-4f4c-943a-9431cb585027" (UID: "9b31867d-2f52-4f4c-943a-9431cb585027"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.320034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm" (OuterVolumeSpecName: "kube-api-access-hpnfm") pod "9b31867d-2f52-4f4c-943a-9431cb585027" (UID: "9b31867d-2f52-4f4c-943a-9431cb585027"). InnerVolumeSpecName "kube-api-access-hpnfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.389650 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c2p9f" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.402102 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.406989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407021 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407082 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww6px\" (UniqueName: \"kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407156 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407186 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpnfm\" (UniqueName: \"kubernetes.io/projected/9b31867d-2f52-4f4c-943a-9431cb585027-kube-api-access-hpnfm\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407196 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407204 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b31867d-2f52-4f4c-943a-9431cb585027-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.407213 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b31867d-2f52-4f4c-943a-9431cb585027-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.407443 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:19.907433267 +0000 UTC m=+223.365464808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508245 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5lkn\" (UniqueName: \"kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn\") pod \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508304 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca\") pod \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508333 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config\") pod \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert\") pod \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508485 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.508566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles\") pod \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\" (UID: \"ed2729b3-6b5a-4ae7-bad5-699c95dab85f\") " Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.508873 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.008848428 +0000 UTC m=+223.466879969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca" (OuterVolumeSpecName: "client-ca") pod "ed2729b3-6b5a-4ae7-bad5-699c95dab85f" (UID: "ed2729b3-6b5a-4ae7-bad5-699c95dab85f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509226 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww6px\" (UniqueName: \"kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config" (OuterVolumeSpecName: "config") pod "ed2729b3-6b5a-4ae7-bad5-699c95dab85f" (UID: "ed2729b3-6b5a-4ae7-bad5-699c95dab85f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509286 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509308 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509348 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ed2729b3-6b5a-4ae7-bad5-699c95dab85f" (UID: "ed2729b3-6b5a-4ae7-bad5-699c95dab85f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.509367 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.009354716 +0000 UTC m=+223.467386257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509435 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.509454 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.510344 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.510617 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.522035 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn" (OuterVolumeSpecName: "kube-api-access-g5lkn") pod "ed2729b3-6b5a-4ae7-bad5-699c95dab85f" (UID: "ed2729b3-6b5a-4ae7-bad5-699c95dab85f"). InnerVolumeSpecName "kube-api-access-g5lkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.523859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ed2729b3-6b5a-4ae7-bad5-699c95dab85f" (UID: "ed2729b3-6b5a-4ae7-bad5-699c95dab85f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.525236 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.538303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww6px\" (UniqueName: \"kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px\") pod \"route-controller-manager-7fb98dc7f7-g49lt\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") " pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.611269 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.611464 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.11143826 +0000 UTC m=+223.569469801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.611565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.611759 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5lkn\" (UniqueName: \"kubernetes.io/projected/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-kube-api-access-g5lkn\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.611793 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.611802 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ed2729b3-6b5a-4ae7-bad5-699c95dab85f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.612025 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:20.112013501 +0000 UTC m=+223.570045042 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.638198 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" event={"ID":"d1a9c8a4-d7c9-4365-8516-465b89c76ea8","Type":"ContainerStarted","Data":"99be79088ddfd7f80669aea1143f116e4bd5db832adf0b396f6b68c444c9f3fa"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.639151 4795 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-wwjgw container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.639178 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" podUID="d1a9c8a4-d7c9-4365-8516-465b89c76ea8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.641001 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b31867d-2f52-4f4c-943a-9431cb585027" containerID="113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de" exitCode=0 Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.641037 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" event={"ID":"9b31867d-2f52-4f4c-943a-9431cb585027","Type":"ContainerDied","Data":"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.641052 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" event={"ID":"9b31867d-2f52-4f4c-943a-9431cb585027","Type":"ContainerDied","Data":"f1055e0f897eb3de575665d60219e71c19392e8e5d91a217f1cf463d9d4c9c60"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.641066 4795 scope.go:117] "RemoveContainer" containerID="113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.641143 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.648463 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:19 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:19 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:19 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.648498 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.654908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" event={"ID":"ff31d5af-4eae-43e7-8512-c6f5d54501e1","Type":"ContainerStarted","Data":"c8a620a751d9e8694b7daa37279282b0fbf7008de6092ba237774e8743691e9d"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.670916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" event={"ID":"661df377-ed57-4f75-9be9-3fc5f87cf37e","Type":"ContainerStarted","Data":"4ba37ac78d373afad371c1ff6afb94705e84340687cde353bc19394d5ef469a9"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.671527 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.687178 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" event={"ID":"7dc57139-c6ad-4639-a09f-d07f8da49f4e","Type":"ContainerStarted","Data":"a437a01737b0fc2abcb65b81e5839d8a14fb18d1f8280765b38578848b11cbf4"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.688516 4795 scope.go:117] "RemoveContainer" containerID="113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.697931 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-q9zkh" podStartSLOduration=166.697916422 podStartE2EDuration="2m46.697916422s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.677745476 +0000 UTC m=+223.135777017" watchObservedRunningTime="2026-03-20 17:21:19.697916422 +0000 UTC m=+223.155947963" Mar 20 17:21:19 crc kubenswrapper[4795]: 
I0320 17:21:19.699131 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"] Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.701769 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g9lr4"] Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.713515 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.714510 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.214489949 +0000 UTC m=+223.672521490 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.717897 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de\": container with ID starting with 113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de not found: ID does not exist" containerID="113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.717949 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de"} err="failed to get container status \"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de\": rpc error: code = NotFound desc = could not find container \"113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de\": container with ID starting with 113e3c01a06944cc9fd3d1734f692b9f362a40d4f2bd01b8b8565b447aa643de not found: ID does not exist" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.728454 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerID="2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58" exitCode=0 Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.728547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" 
event={"ID":"ed2729b3-6b5a-4ae7-bad5-699c95dab85f","Type":"ContainerDied","Data":"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.728581 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" event={"ID":"ed2729b3-6b5a-4ae7-bad5-699c95dab85f","Type":"ContainerDied","Data":"8dc54184aa3235d402105c8c10b4efe9d346fbed9538a68e181e2046d4944844"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.728597 4795 scope.go:117] "RemoveContainer" containerID="2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.728714 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nww6d" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.740993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kgsw2" event={"ID":"7479d10c-1c3b-497e-8dda-07cd22aeccf0","Type":"ContainerStarted","Data":"8b647c9a4a8522117ddf86614575a4051c8a80c1f8a50361724bf5772af1c8e4"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.758746 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pckpf" podStartSLOduration=166.758729091 podStartE2EDuration="2m46.758729091s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.758194723 +0000 UTC m=+223.216226254" watchObservedRunningTime="2026-03-20 17:21:19.758729091 +0000 UTC m=+223.216760632" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.760901 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bq4hh" podStartSLOduration=166.76089195 podStartE2EDuration="2m46.76089195s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.732973035 +0000 UTC m=+223.191004576" watchObservedRunningTime="2026-03-20 17:21:19.76089195 +0000 UTC m=+223.218923491" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.769816 4795 scope.go:117] "RemoveContainer" containerID="2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.770201 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5v822" event={"ID":"ee6682f1-2148-45a9-ac41-aeb6fddbabb4","Type":"ContainerStarted","Data":"1c7765a0d8468d27ef8c9fafd096ec42e11be0366fc85bb4abb6ca8b628ebadc"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.770347 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5v822" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.772510 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"] Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.774234 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58\": container with ID starting with 2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58 not found: ID does not exist" containerID="2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.774299 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58"} err="failed to get container status \"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58\": rpc error: code = NotFound desc = could not find container \"2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58\": container with ID starting with 2f56f55936c1aefe33371c8c850392175779121b08e55d63ab6eff1d2df43f58 not found: ID does not exist" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.778837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-454wp" event={"ID":"489757b2-0de4-4275-8931-daa5c3b4a75a","Type":"ContainerStarted","Data":"254ef3aa6a62fa55712408eb37e7db7600f3e1dcb0583b6377071776549c08f7"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.781905 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nww6d"] Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.788907 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8jwc2" event={"ID":"d2da6bf3-e17d-4adf-96dd-ea097cae192b","Type":"ContainerStarted","Data":"65f8d4a1b294283aea2b3503643700af7592aef35b45b1492d252b985db0e342"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.811750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9r828" event={"ID":"587836f8-b700-43d0-940e-81d7820b2a6b","Type":"ContainerStarted","Data":"0bc6665630ff72f21309e606f4ce11e1137f70eb53ab17fe54ad2f4b5682fcc8"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.815720 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5v822" podStartSLOduration=8.815709602 podStartE2EDuration="8.815709602s" podCreationTimestamp="2026-03-20 17:21:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.812257868 +0000 UTC m=+223.270289409" watchObservedRunningTime="2026-03-20 17:21:19.815709602 +0000 UTC m=+223.273741143" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.834135 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" event={"ID":"fe2960a0-9218-4d46-8c50-7285c5e27882","Type":"ContainerStarted","Data":"006935bafc32330ec5f80ed2d0a4cb0d7be3eba372e67f9e4f9eb21a52e28ab6"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.841319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.843052 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.343039016 +0000 UTC m=+223.801070557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.846020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" event={"ID":"d0068e5c-7377-479d-9cc5-fd1270c74b33","Type":"ContainerStarted","Data":"9a18b4cf9d78c62f682f6919ed686ea00222f8e54f354caca0d82613fc23a31f"} Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.855761 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-xzx7n container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.855820 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xzx7n" podUID="7074cf98-12f4-4a73-ad96-4959f64398a7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.856822 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nsf5t" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.862273 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p42nc" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.870762 4795 
???:1] "http: TLS handshake error from 192.168.126.11:46044: no serving certificate available for the kubelet" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.917197 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kzt8c" podStartSLOduration=166.917178665 podStartE2EDuration="2m46.917178665s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.866083966 +0000 UTC m=+223.324115507" watchObservedRunningTime="2026-03-20 17:21:19.917178665 +0000 UTC m=+223.375210206" Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.947184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:19 crc kubenswrapper[4795]: E0320 17:21:19.948632 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.448611437 +0000 UTC m=+223.906642978 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:19 crc kubenswrapper[4795]: I0320 17:21:19.991327 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vwmr9" podStartSLOduration=166.991309693 podStartE2EDuration="2m46.991309693s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:19.991082445 +0000 UTC m=+223.449113986" watchObservedRunningTime="2026-03-20 17:21:19.991309693 +0000 UTC m=+223.449341234" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.007984 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9xgc" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.049633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.049971 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:20.549959514 +0000 UTC m=+224.007991055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.104389 4795 patch_prober.go:28] interesting pod/apiserver-76f77b778f-bl2bp container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]log ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]etcd ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/max-in-flight-filter ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 17:21:20 crc kubenswrapper[4795]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 17:21:20 crc kubenswrapper[4795]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/project.openshift.io-projectcache ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 20 17:21:20 crc kubenswrapper[4795]: [-]poststarthook/openshift.io-startinformers failed: 
reason withheld Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 17:21:20 crc kubenswrapper[4795]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 17:21:20 crc kubenswrapper[4795]: livez check failed Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.104449 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" podUID="5c21571e-5513-46e0-9eed-4ec64df8e445" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.151166 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.152283 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.652258016 +0000 UTC m=+224.110289557 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.220566 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"] Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.256510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.257650 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.757635489 +0000 UTC m=+224.215667030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.357362 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.357742 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.857714782 +0000 UTC m=+224.315746333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.462260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.462625 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:20.962614187 +0000 UTC m=+224.420645728 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.563644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.563922 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.063895253 +0000 UTC m=+224.521926794 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.564060 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.564338 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.064327019 +0000 UTC m=+224.522358560 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.643069 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:20 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:20 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:20 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.643196 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.664682 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.664820 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:21.164794725 +0000 UTC m=+224.622826266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.664956 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.665257 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.165241731 +0000 UTC m=+224.623273272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.765678 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.765823 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.26579853 +0000 UTC m=+224.723830061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.766095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.766407 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.266398342 +0000 UTC m=+224.724429883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.780478 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.780749 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.780767 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.780876 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" containerName="controller-manager" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.781781 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.786126 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.796555 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.806854 4795 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.863660 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-454wp" event={"ID":"489757b2-0de4-4275-8931-daa5c3b4a75a","Type":"ContainerStarted","Data":"e2a08c6c0616c88e3639564a54bd192554c1ae0d5f874d76d44021f4ba46ce86"} Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.863721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-454wp" event={"ID":"489757b2-0de4-4275-8931-daa5c3b4a75a","Type":"ContainerStarted","Data":"0813db22d2e9f2c49d4516839f56a75d5403aaebf57bb7fec6cb3b21729f3ef8"} Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.866592 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" event={"ID":"4c425d50-cbc6-4fa3-b286-ef1b8d696198","Type":"ContainerStarted","Data":"fafe861a908f21e3bae7c524d1594f030ac1c3cab2621b45672111e5737afdd5"} Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.866668 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" 
event={"ID":"4c425d50-cbc6-4fa3-b286-ef1b8d696198","Type":"ContainerStarted","Data":"6092fd1860a57c1ca6f62820e564b6ec08d02b8c7829faf4d750183f7837476f"} Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.866832 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.867128 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.367108147 +0000 UTC m=+224.825139698 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.867676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.868034 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.368019159 +0000 UTC m=+224.826050700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.877509 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wwjgw" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.892708 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" podStartSLOduration=3.892677576 podStartE2EDuration="3.892677576s" podCreationTimestamp="2026-03-20 17:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:20.889926888 +0000 UTC m=+224.347958429" watchObservedRunningTime="2026-03-20 17:21:20.892677576 +0000 UTC m=+224.350709117" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.969155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.969844 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6zsd\" (UniqueName: \"kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.970166 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.970221 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:20 crc kubenswrapper[4795]: E0320 17:21:20.970311 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.470297421 +0000 UTC m=+224.928328962 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.983191 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.985125 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:20 crc kubenswrapper[4795]: I0320 17:21:20.987935 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.004483 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072162 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6zsd\" (UniqueName: \"kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072252 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 
17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072307 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2vjc\" (UniqueName: \"kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072324 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.072355 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: 
\"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.072601 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.572590333 +0000 UTC m=+225.030621874 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.073457 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.073520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities\") pod \"community-operators-kk5rk\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.094435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6zsd\" (UniqueName: \"kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd\") pod \"community-operators-kk5rk\" (UID: 
\"57849322-f280-42ee-a330-18120aeed5db\") " pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.103711 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173035 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.173217 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.673190064 +0000 UTC m=+225.131221605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173331 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173364 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2vjc\" (UniqueName: \"kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: 
\"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.173868 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.673861668 +0000 UTC m=+225.131893209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.173907 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.175987 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hv8kd"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.176941 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.190249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2vjc\" (UniqueName: \"kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc\") pod \"certified-operators-kzvch\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.198863 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv8kd"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.259037 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b31867d-2f52-4f4c-943a-9431cb585027" path="/var/lib/kubelet/pods/9b31867d-2f52-4f4c-943a-9431cb585027/volumes" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.261219 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2729b3-6b5a-4ae7-bad5-699c95dab85f" path="/var/lib/kubelet/pods/ed2729b3-6b5a-4ae7-bad5-699c95dab85f/volumes" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.274272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.274490 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.774459119 +0000 UTC m=+225.232490720 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.274720 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.274784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.274935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.275020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qsng\" (UniqueName: 
\"kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.275300 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.775291759 +0000 UTC m=+225.233323370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.302554 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.374157 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.375302 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.375755 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.375901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.375953 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qsng\" (UniqueName: \"kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.376030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.376106 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:21:21.876079047 +0000 UTC m=+225.334110588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.376396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.376438 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.383196 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.395448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qsng\" (UniqueName: \"kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng\") pod \"community-operators-hv8kd\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") " pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.480289 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vks54\" (UniqueName: \"kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.480319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.480361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.480383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: E0320 17:21:21.480672 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:21:21.98066228 +0000 UTC m=+225.438693821 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6rj8" (UID: "3dde633a-aefe-4c9b-84a7-301279016583") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.489998 4795 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T17:21:20.806878929Z","Handler":null,"Name":""} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.494779 4795 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.494811 4795 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.529366 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.581405 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.581734 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.581755 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vks54\" (UniqueName: \"kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.581805 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.582222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " 
pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.582325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.599956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vks54\" (UniqueName: \"kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54\") pod \"certified-operators-l6vnf\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") " pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.604016 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.633955 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.641988 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:21 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:21 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:21 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.642045 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.683919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.689394 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.689422 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.722352 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6rj8\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.725974 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv8kd"] Mar 20 17:21:21 crc kubenswrapper[4795]: W0320 17:21:21.729243 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b4d98b5_0434_4a84_b890_d2428de998b7.slice/crio-2e4b95450b8315d24d755f1e11d49c142fbb519e0e3cc346ea05ef39f9bff4b9 WatchSource:0}: Error finding container 2e4b95450b8315d24d755f1e11d49c142fbb519e0e3cc346ea05ef39f9bff4b9: Status 404 returned error can't find the container with id 2e4b95450b8315d24d755f1e11d49c142fbb519e0e3cc346ea05ef39f9bff4b9 Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.743232 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.754658 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.816767 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.879127 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-454wp" event={"ID":"489757b2-0de4-4275-8931-daa5c3b4a75a","Type":"ContainerStarted","Data":"8fffbb3c7f2eb6f9ce46db083a84a057678e59722980a30f14937a525bbf5adc"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.894184 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.894795 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerStarted","Data":"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902118 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerStarted","Data":"2e4b95450b8315d24d755f1e11d49c142fbb519e0e3cc346ea05ef39f9bff4b9"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902442 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902768 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902937 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.902969 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.905425 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.909770 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.910442 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.915181 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.921030 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-454wp" podStartSLOduration=10.921006361 podStartE2EDuration="10.921006361s" podCreationTimestamp="2026-03-20 17:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:21.912734533 +0000 UTC m=+225.370766094" watchObservedRunningTime="2026-03-20 17:21:21.921006361 +0000 UTC m=+225.379037902" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.926010 4795 generic.go:334] "Generic (PLEG): container finished" podID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerID="63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031" exitCode=0 Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.926098 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerDied","Data":"63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.926123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerStarted","Data":"352f21e959b8a9617f62fdaa474337c620b65ea35de203e2a6258d4f6ab66557"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.940419 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" 
event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerStarted","Data":"333adeb9b81abd47208fc6ec71e454bad1f18be9356efa101b49dd2d5983cc19"} Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.941564 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.953888 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.976798 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"] Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.990115 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.990207 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.990226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " 
pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.990253 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: I0320 17:21:21.990295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc2m9\" (UniqueName: \"kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:21 crc kubenswrapper[4795]: W0320 17:21:21.994538 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b3b1055_857d_4334_b39a_24b0ac9139d1.slice/crio-1491707e1e9a76fca81ebcac286e8ae28085443c9768ed4379919b2d9e23dca3 WatchSource:0}: Error finding container 1491707e1e9a76fca81ebcac286e8ae28085443c9768ed4379919b2d9e23dca3: Status 404 returned error can't find the container with id 1491707e1e9a76fca81ebcac286e8ae28085443c9768ed4379919b2d9e23dca3 Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.055477 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"] Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.091228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: 
\"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.091372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.091390 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.091438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.091466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc2m9\" (UniqueName: \"kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.097971 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.098549 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.099904 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.114763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc2m9\" (UniqueName: \"kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.125954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert\") pod \"controller-manager-bbf9678f8-ftr7c\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") " pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.242825 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.452723 4795 ???:1] "http: TLS handshake error from 192.168.126.11:46046: no serving certificate available for the kubelet" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.456179 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"] Mar 20 17:21:22 crc kubenswrapper[4795]: W0320 17:21:22.464386 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5317d308_31fb_4863_bf91_5ba6a632ba67.slice/crio-27fc0f6af22544b7220dee278277b66ac2ca97b4247b7db43a6f4fefc1e35570 WatchSource:0}: Error finding container 27fc0f6af22544b7220dee278277b66ac2ca97b4247b7db43a6f4fefc1e35570: Status 404 returned error can't find the container with id 27fc0f6af22544b7220dee278277b66ac2ca97b4247b7db43a6f4fefc1e35570 Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.643890 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:22 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:22 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:22 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.643943 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.771243 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.772453 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.775051 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.784736 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.902613 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.902716 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.902769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgvm8\" (UniqueName: \"kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.957580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" event={"ID":"5317d308-31fb-4863-bf91-5ba6a632ba67","Type":"ContainerStarted","Data":"9c453f9ce1a481a89895009d4dd065491b2d55038377ff11c4e29a62f9828eb6"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.957631 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" event={"ID":"5317d308-31fb-4863-bf91-5ba6a632ba67","Type":"ContainerStarted","Data":"27fc0f6af22544b7220dee278277b66ac2ca97b4247b7db43a6f4fefc1e35570"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.957648 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.963330 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerID="82d8d3d7e1e3eb80a041eea63e969e0d1aa9af7af1cccc7c6c9f3460b4809935" exitCode=0 Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.963368 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerDied","Data":"82d8d3d7e1e3eb80a041eea63e969e0d1aa9af7af1cccc7c6c9f3460b4809935"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.963414 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerStarted","Data":"1491707e1e9a76fca81ebcac286e8ae28085443c9768ed4379919b2d9e23dca3"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.966153 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" event={"ID":"3dde633a-aefe-4c9b-84a7-301279016583","Type":"ContainerStarted","Data":"76b0d688b149e45910b8799bbb4e20410e0480f70929e235a28f86178319123a"} 
Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.966189 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" event={"ID":"3dde633a-aefe-4c9b-84a7-301279016583","Type":"ContainerStarted","Data":"3db277134197ff5142f4f0d85c126502b98d3bb29670b6c4409e582bcdf40d86"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.966293 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.969036 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerID="051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6" exitCode=0 Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.969073 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerDied","Data":"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.972174 4795 generic.go:334] "Generic (PLEG): container finished" podID="57849322-f280-42ee-a330-18120aeed5db" containerID="cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1" exitCode=0 Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.972225 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerDied","Data":"cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1"} Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.975140 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" Mar 20 17:21:22 crc kubenswrapper[4795]: I0320 17:21:22.985596 4795 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" podStartSLOduration=5.985575499 podStartE2EDuration="5.985575499s" podCreationTimestamp="2026-03-20 17:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:22.978864637 +0000 UTC m=+226.436896198" watchObservedRunningTime="2026-03-20 17:21:22.985575499 +0000 UTC m=+226.443607040" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.004778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.004903 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgvm8\" (UniqueName: \"kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.005042 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.005310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " 
pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.005406 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.025067 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgvm8\" (UniqueName: \"kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8\") pod \"redhat-marketplace-ht4zv\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.133978 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.174717 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" podStartSLOduration=170.174699656 podStartE2EDuration="2m50.174699656s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:23.093883477 +0000 UTC m=+226.551915018" watchObservedRunningTime="2026-03-20 17:21:23.174699656 +0000 UTC m=+226.632731197" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.175962 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.176900 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.193106 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.289433 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.325991 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js4cp\" (UniqueName: \"kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.326083 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.326145 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.427468 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js4cp\" (UniqueName: 
\"kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.427537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.427601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.428065 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.428337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.452553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js4cp\" (UniqueName: 
\"kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp\") pod \"redhat-marketplace-x4492\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") " pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.522261 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.579522 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:21:23 crc kubenswrapper[4795]: W0320 17:21:23.586880 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70000016_e928_4b11_a31d_4d08e9450a1c.slice/crio-37c22f0e8db69278ef99884d66f0d1b39626955adbad846c6823797b6df30257 WatchSource:0}: Error finding container 37c22f0e8db69278ef99884d66f0d1b39626955adbad846c6823797b6df30257: Status 404 returned error can't find the container with id 37c22f0e8db69278ef99884d66f0d1b39626955adbad846c6823797b6df30257 Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.641894 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:23 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:23 crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:23 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.641955 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:23 crc kubenswrapper[4795]: 
I0320 17:21:23.761670 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.773372 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bl2bp" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.792419 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.792903 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.797755 4795 patch_prober.go:28] interesting pod/console-f9d7485db-hn4r8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.797799 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hn4r8" podUID="662f8843-e25d-48ce-989d-9ea05937757d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.900915 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.916783 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.917638 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.920001 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.920464 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.921823 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.991874 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:21:23 crc kubenswrapper[4795]: I0320 17:21:23.996720 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:23.999903 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerStarted","Data":"0bf34893ceb2a123dbae4a13fdf9053d4d9c1472bbfe52b966a8795f5fc54346"} Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.000899 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.001797 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.006500 4795 generic.go:334] "Generic (PLEG): container finished" podID="70000016-e928-4b11-a31d-4d08e9450a1c" containerID="9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25" exitCode=0 Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 
17:21:24.006855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerDied","Data":"9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25"} Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.006888 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerStarted","Data":"37c22f0e8db69278ef99884d66f0d1b39626955adbad846c6823797b6df30257"} Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.043943 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.044072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.144893 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhxfb\" (UniqueName: \"kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.145491 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.145575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.145598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.145614 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.155271 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.188548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.246736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.246823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxfb\" (UniqueName: \"kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.246904 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.247505 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.247661 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.265057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhxfb\" (UniqueName: \"kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb\") pod \"redhat-operators-q7czt\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.275550 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.322563 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.375149 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"] Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.376147 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.383120 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"] Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.452324 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.452388 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.452416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dff8d\" (UniqueName: \"kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.484925 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-xzx7n container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.484966 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-xzx7n container/download-server namespace/openshift-console: Readiness 
probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.484986 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xzx7n" podUID="7074cf98-12f4-4a73-ad96-4959f64398a7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.485024 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xzx7n" podUID="7074cf98-12f4-4a73-ad96-4959f64398a7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.553696 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.553739 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.553755 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dff8d\" (UniqueName: \"kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " 
pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.554424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.554635 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.584964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dff8d\" (UniqueName: \"kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d\") pod \"redhat-operators-2m5ld\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") " pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.632217 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.638768 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.641386 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:21:24 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 20 17:21:24 
crc kubenswrapper[4795]: [+]process-running ok Mar 20 17:21:24 crc kubenswrapper[4795]: healthz check failed Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.641432 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:21:24 crc kubenswrapper[4795]: I0320 17:21:24.749130 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2m5ld" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.013308 4795 generic.go:334] "Generic (PLEG): container finished" podID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerID="356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf" exitCode=0 Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.013391 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerDied","Data":"356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf"} Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.640950 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.643200 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-lrxrs" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.713040 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.713637 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.715269 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.715880 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.730895 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.881611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.881748 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.983592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.983648 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:25 crc kubenswrapper[4795]: I0320 17:21:25.983753 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:26 crc kubenswrapper[4795]: I0320 17:21:26.003085 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:26 crc kubenswrapper[4795]: I0320 17:21:26.021095 4795 generic.go:334] "Generic (PLEG): container finished" podID="918aa57e-8c94-4427-b6bd-218a5687d684" containerID="1a29e74f6dc8f40ef08045f483f837253c35a577aa6f85ce5cd8c2a56afebf9c" exitCode=0 Mar 20 17:21:26 crc kubenswrapper[4795]: I0320 17:21:26.021156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" event={"ID":"918aa57e-8c94-4427-b6bd-218a5687d684","Type":"ContainerDied","Data":"1a29e74f6dc8f40ef08045f483f837253c35a577aa6f85ce5cd8c2a56afebf9c"} Mar 20 17:21:26 crc kubenswrapper[4795]: I0320 17:21:26.043145 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:27 crc kubenswrapper[4795]: I0320 17:21:27.196604 4795 ???:1] "http: TLS handshake error from 192.168.126.11:45376: no serving certificate available for the kubelet" Mar 20 17:21:27 crc kubenswrapper[4795]: I0320 17:21:27.605194 4795 ???:1] "http: TLS handshake error from 192.168.126.11:45386: no serving certificate available for the kubelet" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.462525 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.645364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzk8h\" (UniqueName: \"kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h\") pod \"918aa57e-8c94-4427-b6bd-218a5687d684\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.645413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume\") pod \"918aa57e-8c94-4427-b6bd-218a5687d684\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.646720 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume\") pod \"918aa57e-8c94-4427-b6bd-218a5687d684\" (UID: \"918aa57e-8c94-4427-b6bd-218a5687d684\") " Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.647673 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"918aa57e-8c94-4427-b6bd-218a5687d684" (UID: "918aa57e-8c94-4427-b6bd-218a5687d684"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.656086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "918aa57e-8c94-4427-b6bd-218a5687d684" (UID: "918aa57e-8c94-4427-b6bd-218a5687d684"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.656713 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h" (OuterVolumeSpecName: "kube-api-access-vzk8h") pod "918aa57e-8c94-4427-b6bd-218a5687d684" (UID: "918aa57e-8c94-4427-b6bd-218a5687d684"). InnerVolumeSpecName "kube-api-access-vzk8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.748029 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/918aa57e-8c94-4427-b6bd-218a5687d684-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.748062 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzk8h\" (UniqueName: \"kubernetes.io/projected/918aa57e-8c94-4427-b6bd-218a5687d684-kube-api-access-vzk8h\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.748073 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/918aa57e-8c94-4427-b6bd-218a5687d684-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:29 crc kubenswrapper[4795]: I0320 17:21:29.824572 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5v822" Mar 20 17:21:30 crc kubenswrapper[4795]: I0320 17:21:30.048424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" event={"ID":"918aa57e-8c94-4427-b6bd-218a5687d684","Type":"ContainerDied","Data":"552c19deec1ee9883f89b895bd5a9ae748bbcb5e7537b45d9e966f6c5f189edb"} Mar 20 17:21:30 crc kubenswrapper[4795]: I0320 17:21:30.048472 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552c19deec1ee9883f89b895bd5a9ae748bbcb5e7537b45d9e966f6c5f189edb" Mar 20 17:21:30 crc kubenswrapper[4795]: I0320 17:21:30.048479 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr" Mar 20 17:21:32 crc kubenswrapper[4795]: I0320 17:21:32.191265 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:32 crc kubenswrapper[4795]: I0320 17:21:32.197060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77-metrics-certs\") pod \"network-metrics-daemon-jpp4c\" (UID: \"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77\") " pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:32 crc kubenswrapper[4795]: I0320 17:21:32.378408 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpp4c" Mar 20 17:21:33 crc kubenswrapper[4795]: I0320 17:21:33.791534 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:33 crc kubenswrapper[4795]: I0320 17:21:33.795202 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:21:34 crc kubenswrapper[4795]: I0320 17:21:34.491166 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-xzx7n" Mar 20 17:21:34 crc kubenswrapper[4795]: I0320 17:21:34.491521 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:21:35 crc kubenswrapper[4795]: W0320 17:21:35.195904 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73dd05f7_2cc4_4a99_b12d_26e4d436acca.slice/crio-d21b253fa758e914360b02dd8aa7261d5b383defcc69cfc8b102952a167fd840 WatchSource:0}: Error finding container d21b253fa758e914360b02dd8aa7261d5b383defcc69cfc8b102952a167fd840: Status 404 returned error can't find the container with id d21b253fa758e914360b02dd8aa7261d5b383defcc69cfc8b102952a167fd840 Mar 20 17:21:36 crc kubenswrapper[4795]: I0320 17:21:36.080483 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerStarted","Data":"d21b253fa758e914360b02dd8aa7261d5b383defcc69cfc8b102952a167fd840"} Mar 20 17:21:36 crc kubenswrapper[4795]: I0320 17:21:36.443107 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"] Mar 20 17:21:36 crc kubenswrapper[4795]: I0320 17:21:36.443307 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerName="controller-manager" containerID="cri-o://9c453f9ce1a481a89895009d4dd065491b2d55038377ff11c4e29a62f9828eb6" gracePeriod=30 Mar 20 17:21:36 crc kubenswrapper[4795]: I0320 17:21:36.451466 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"] Mar 20 17:21:36 crc kubenswrapper[4795]: I0320 17:21:36.451756 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerName="route-controller-manager" containerID="cri-o://fafe861a908f21e3bae7c524d1594f030ac1c3cab2621b45672111e5737afdd5" gracePeriod=30 Mar 20 17:21:37 crc kubenswrapper[4795]: I0320 17:21:37.087103 4795 
generic.go:334] "Generic (PLEG): container finished" podID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerID="9c453f9ce1a481a89895009d4dd065491b2d55038377ff11c4e29a62f9828eb6" exitCode=0
Mar 20 17:21:37 crc kubenswrapper[4795]: I0320 17:21:37.087202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" event={"ID":"5317d308-31fb-4863-bf91-5ba6a632ba67","Type":"ContainerDied","Data":"9c453f9ce1a481a89895009d4dd065491b2d55038377ff11c4e29a62f9828eb6"}
Mar 20 17:21:37 crc kubenswrapper[4795]: I0320 17:21:37.093208 4795 generic.go:334] "Generic (PLEG): container finished" podID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerID="fafe861a908f21e3bae7c524d1594f030ac1c3cab2621b45672111e5737afdd5" exitCode=0
Mar 20 17:21:37 crc kubenswrapper[4795]: I0320 17:21:37.093245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" event={"ID":"4c425d50-cbc6-4fa3-b286-ef1b8d696198","Type":"ContainerDied","Data":"fafe861a908f21e3bae7c524d1594f030ac1c3cab2621b45672111e5737afdd5"}
Mar 20 17:21:39 crc kubenswrapper[4795]: I0320 17:21:39.673129 4795 patch_prober.go:28] interesting pod/route-controller-manager-7fb98dc7f7-g49lt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body=
Mar 20 17:21:39 crc kubenswrapper[4795]: I0320 17:21:39.673231 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused"
Mar 20 17:21:41 crc kubenswrapper[4795]: I0320 17:21:41.300265 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:21:41 crc kubenswrapper[4795]: I0320 17:21:41.300363 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:21:41 crc kubenswrapper[4795]: I0320 17:21:41.823431 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8"
Mar 20 17:21:42 crc kubenswrapper[4795]: I0320 17:21:42.244045 4795 patch_prober.go:28] interesting pod/controller-manager-bbf9678f8-ftr7c container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body=
Mar 20 17:21:42 crc kubenswrapper[4795]: I0320 17:21:42.244108 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused"
Mar 20 17:21:42 crc kubenswrapper[4795]: E0320 17:21:42.970903 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 20 17:21:42 crc kubenswrapper[4795]: E0320 17:21:42.971368 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w2vjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kzvch_openshift-marketplace(fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 17:21:42 crc kubenswrapper[4795]: E0320 17:21:42.973240 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kzvch" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f"
Mar 20 17:21:43 crc kubenswrapper[4795]: I0320 17:21:43.237540 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 17:21:43 crc kubenswrapper[4795]: I0320 17:21:43.347261 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"]
Mar 20 17:21:43 crc kubenswrapper[4795]: I0320 17:21:43.358750 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 17:21:47 crc kubenswrapper[4795]: E0320 17:21:47.016149 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kzvch" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f"
Mar 20 17:21:47 crc kubenswrapper[4795]: E0320 17:21:47.240251 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 20 17:21:47 crc kubenswrapper[4795]: E0320 17:21:47.240752 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2qsng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hv8kd_openshift-marketplace(7b4d98b5-0434-4a84-b890-d2428de998b7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 17:21:47 crc kubenswrapper[4795]: E0320 17:21:47.242024 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hv8kd" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7"
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.105607 4795 ???:1] "http: TLS handshake error from 192.168.126.11:39038: no serving certificate available for the kubelet"
Mar 20 17:21:48 crc kubenswrapper[4795]: E0320 17:21:48.736247 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hv8kd" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7"
Mar 20 17:21:48 crc kubenswrapper[4795]: W0320 17:21:48.738520 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8dc34161_d5d0_4580_88a1_c5e2b55c924d.slice/crio-72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6 WatchSource:0}: Error finding container 72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6: Status 404 returned error can't find the container with id 72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.890290 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.910988 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.912780 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jpp4c"]
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.924273 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"]
Mar 20 17:21:48 crc kubenswrapper[4795]: E0320 17:21:48.924525 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerName="route-controller-manager"
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.924602 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerName="route-controller-manager"
Mar 20 17:21:48 crc kubenswrapper[4795]: E0320 17:21:48.924620 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="918aa57e-8c94-4427-b6bd-218a5687d684" containerName="collect-profiles"
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.924778 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="918aa57e-8c94-4427-b6bd-218a5687d684" containerName="collect-profiles"
Mar 20 17:21:48 crc kubenswrapper[4795]: E0320 17:21:48.924796 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerName="controller-manager"
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.924806 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerName="controller-manager"
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.925325 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" containerName="route-controller-manager"
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.925345 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" containerName="controller-manager"
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.925362 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="918aa57e-8c94-4427-b6bd-218a5687d684" containerName="collect-profiles"
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.925976 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:48 crc kubenswrapper[4795]: W0320 17:21:48.936420 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod996ef79e_1d5b_4e1b_b1f0_efd1ca2c9a77.slice/crio-e2a7977d25bb2572a82872c249f249b3bcd9caade6c08d5f54b5f1f678db3ddd WatchSource:0}: Error finding container e2a7977d25bb2572a82872c249f249b3bcd9caade6c08d5f54b5f1f678db3ddd: Status 404 returned error can't find the container with id e2a7977d25bb2572a82872c249f249b3bcd9caade6c08d5f54b5f1f678db3ddd
Mar 20 17:21:48 crc kubenswrapper[4795]: I0320 17:21:48.938786 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"]
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert\") pod \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") "
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040727 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca\") pod \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") "
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040746 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config\") pod \"5317d308-31fb-4863-bf91-5ba6a632ba67\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") "
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww6px\" (UniqueName: \"kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px\") pod \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") "
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040802 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles\") pod \"5317d308-31fb-4863-bf91-5ba6a632ba67\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") "
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040824 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config\") pod \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\" (UID: \"4c425d50-cbc6-4fa3-b286-ef1b8d696198\") "
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040881 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca\") pod \"5317d308-31fb-4863-bf91-5ba6a632ba67\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") "
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040900 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc2m9\" (UniqueName: \"kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9\") pod \"5317d308-31fb-4863-bf91-5ba6a632ba67\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") "
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.040920 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert\") pod \"5317d308-31fb-4863-bf91-5ba6a632ba67\" (UID: \"5317d308-31fb-4863-bf91-5ba6a632ba67\") "
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.041086 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.041128 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.041169 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.041190 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69x8r\" (UniqueName: \"kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.043034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c425d50-cbc6-4fa3-b286-ef1b8d696198" (UID: "4c425d50-cbc6-4fa3-b286-ef1b8d696198"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.043046 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca" (OuterVolumeSpecName: "client-ca") pod "5317d308-31fb-4863-bf91-5ba6a632ba67" (UID: "5317d308-31fb-4863-bf91-5ba6a632ba67"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.043058 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config" (OuterVolumeSpecName: "config") pod "5317d308-31fb-4863-bf91-5ba6a632ba67" (UID: "5317d308-31fb-4863-bf91-5ba6a632ba67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.043634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config" (OuterVolumeSpecName: "config") pod "4c425d50-cbc6-4fa3-b286-ef1b8d696198" (UID: "4c425d50-cbc6-4fa3-b286-ef1b8d696198"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.043918 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5317d308-31fb-4863-bf91-5ba6a632ba67" (UID: "5317d308-31fb-4863-bf91-5ba6a632ba67"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.046618 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5317d308-31fb-4863-bf91-5ba6a632ba67" (UID: "5317d308-31fb-4863-bf91-5ba6a632ba67"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.046658 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c425d50-cbc6-4fa3-b286-ef1b8d696198" (UID: "4c425d50-cbc6-4fa3-b286-ef1b8d696198"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.046757 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9" (OuterVolumeSpecName: "kube-api-access-jc2m9") pod "5317d308-31fb-4863-bf91-5ba6a632ba67" (UID: "5317d308-31fb-4863-bf91-5ba6a632ba67"). InnerVolumeSpecName "kube-api-access-jc2m9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.046941 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px" (OuterVolumeSpecName: "kube-api-access-ww6px") pod "4c425d50-cbc6-4fa3-b286-ef1b8d696198" (UID: "4c425d50-cbc6-4fa3-b286-ef1b8d696198"). InnerVolumeSpecName "kube-api-access-ww6px". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142198 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142249 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142290 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69x8r\" (UniqueName: \"kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142358 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c425d50-cbc6-4fa3-b286-ef1b8d696198-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142369 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142378 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142389 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww6px\" (UniqueName: \"kubernetes.io/projected/4c425d50-cbc6-4fa3-b286-ef1b8d696198-kube-api-access-ww6px\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142398 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142406 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c425d50-cbc6-4fa3-b286-ef1b8d696198-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142414 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5317d308-31fb-4863-bf91-5ba6a632ba67-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142424 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc2m9\" (UniqueName: \"kubernetes.io/projected/5317d308-31fb-4863-bf91-5ba6a632ba67-kube-api-access-jc2m9\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.142432 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5317d308-31fb-4863-bf91-5ba6a632ba67-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.143087 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.143477 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.146207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.159522 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69x8r\" (UniqueName: \"kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r\") pod \"route-controller-manager-b7bc44c6c-lndxp\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.159846 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cdc3472c-58bf-4b57-aa00-34677fc42e06","Type":"ContainerStarted","Data":"cdf769cace6742ee2c32cad7bdf9fa8b4a0644fb379f47fb111fd1d463a2ab13"}
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.162587 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.162580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt" event={"ID":"4c425d50-cbc6-4fa3-b286-ef1b8d696198","Type":"ContainerDied","Data":"6092fd1860a57c1ca6f62820e564b6ec08d02b8c7829faf4d750183f7837476f"}
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.163001 4795 scope.go:117] "RemoveContainer" containerID="fafe861a908f21e3bae7c524d1594f030ac1c3cab2621b45672111e5737afdd5"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.164736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" event={"ID":"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77","Type":"ContainerStarted","Data":"e2a7977d25bb2572a82872c249f249b3bcd9caade6c08d5f54b5f1f678db3ddd"}
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.167488 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8dc34161-d5d0-4580-88a1-c5e2b55c924d","Type":"ContainerStarted","Data":"72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6"}
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.168952 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c" event={"ID":"5317d308-31fb-4863-bf91-5ba6a632ba67","Type":"ContainerDied","Data":"27fc0f6af22544b7220dee278277b66ac2ca97b4247b7db43a6f4fefc1e35570"}
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.169028 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.172128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerStarted","Data":"d58b7f5b37a6a35ef39d3d8b6ebcad0e2da7e5425eb92dcc45ec68fe40722e18"}
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.190579 4795 scope.go:117] "RemoveContainer" containerID="9c453f9ce1a481a89895009d4dd065491b2d55038377ff11c4e29a62f9828eb6"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.196436 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"]
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.199282 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb98dc7f7-g49lt"]
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.209060 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"]
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.213844 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bbf9678f8-ftr7c"]
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.263523 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c425d50-cbc6-4fa3-b286-ef1b8d696198" path="/var/lib/kubelet/pods/4c425d50-cbc6-4fa3-b286-ef1b8d696198/volumes"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.264277 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5317d308-31fb-4863-bf91-5ba6a632ba67" path="/var/lib/kubelet/pods/5317d308-31fb-4863-bf91-5ba6a632ba67/volumes"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.304524 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"
Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.440269 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.440730 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgvm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ht4zv_openshift-marketplace(70000016-e928-4b11-a31d-4d08e9450a1c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.441942 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ht4zv" podUID="70000016-e928-4b11-a31d-4d08e9450a1c"
Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.550573 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"]
Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.689065 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.689233 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6zsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Terminati
onMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-kk5rk_openshift-marketplace(57849322-f280-42ee-a330-18120aeed5db): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:21:49 crc kubenswrapper[4795]: E0320 17:21:49.691228 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kk5rk" podUID="57849322-f280-42ee-a330-18120aeed5db" Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.763805 4795 csr.go:261] certificate signing request csr-8w7w5 is approved, waiting to be issued Mar 20 17:21:49 crc kubenswrapper[4795]: I0320 17:21:49.771003 4795 csr.go:257] certificate signing request csr-8w7w5 is issued Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.180208 4795 generic.go:334] "Generic (PLEG): container finished" podID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerID="6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394" exitCode=0 Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.180564 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerDied","Data":"6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.185209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerStarted","Data":"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d"} Mar 20 
17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.189442 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" event={"ID":"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77","Type":"ContainerStarted","Data":"54240c5223551154fcb65b1918dadaadd6a2548bed4a555bb8ff6486ab7d05e0"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.189484 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpp4c" event={"ID":"996ef79e-1d5b-4e1b-b1f0-efd1ca2c9a77","Type":"ContainerStarted","Data":"b089f2b080867dc9421a7a3c2eaaa26fac7817cf5a7db10470ab8773973f1414"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.193610 4795 generic.go:334] "Generic (PLEG): container finished" podID="bed1d31b-b060-45c3-95bf-3b226a36efe1" containerID="76aa98549ce46db60ce0a3b7fd4c6b9ed28e4c1b7375fc84abcdb33fcf4ef287" exitCode=0 Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.193727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567120-j7789" event={"ID":"bed1d31b-b060-45c3-95bf-3b226a36efe1","Type":"ContainerDied","Data":"76aa98549ce46db60ce0a3b7fd4c6b9ed28e4c1b7375fc84abcdb33fcf4ef287"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.201400 4795 generic.go:334] "Generic (PLEG): container finished" podID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerID="898e9ddb331a961041951b7bb1edfb2abf5db69d1009da036bfe796e8579e1e3" exitCode=0 Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.201452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerDied","Data":"898e9ddb331a961041951b7bb1edfb2abf5db69d1009da036bfe796e8579e1e3"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.213968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" 
event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerStarted","Data":"7ec8b989964af7b47bbfe9e2ba650db31b2b4a3194973fb0444b0a3b977d10bc"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.218143 4795 generic.go:334] "Generic (PLEG): container finished" podID="cdc3472c-58bf-4b57-aa00-34677fc42e06" containerID="b2343ae78e6e4067fd48d7e9e2379803526af28d2e60ae47d7c80ca51f8a9546" exitCode=0 Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.218207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cdc3472c-58bf-4b57-aa00-34677fc42e06","Type":"ContainerDied","Data":"b2343ae78e6e4067fd48d7e9e2379803526af28d2e60ae47d7c80ca51f8a9546"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.219931 4795 generic.go:334] "Generic (PLEG): container finished" podID="8dc34161-d5d0-4580-88a1-c5e2b55c924d" containerID="6136ff7225ac1cecd4f6d0e6322199086bbed28ae9106bcb0c283d76cfa16319" exitCode=0 Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.219973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8dc34161-d5d0-4580-88a1-c5e2b55c924d","Type":"ContainerDied","Data":"6136ff7225ac1cecd4f6d0e6322199086bbed28ae9106bcb0c283d76cfa16319"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.222391 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" event={"ID":"af030096-8488-42df-be2c-a39b58ff0612","Type":"ContainerStarted","Data":"de73514b8f320ab741cc124ca96503c98c90527d123632a693748372593ae1e0"} Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.222551 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.222565 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" event={"ID":"af030096-8488-42df-be2c-a39b58ff0612","Type":"ContainerStarted","Data":"d97024fb9f1af8ccac26b9f9b512b0db9ba1fd92cda4860946d1b691addcb115"} Mar 20 17:21:50 crc kubenswrapper[4795]: E0320 17:21:50.225215 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kk5rk" podUID="57849322-f280-42ee-a330-18120aeed5db" Mar 20 17:21:50 crc kubenswrapper[4795]: E0320 17:21:50.228549 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ht4zv" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.229656 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jpp4c" podStartSLOduration=197.229638693 podStartE2EDuration="3m17.229638693s" podCreationTimestamp="2026-03-20 17:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:50.215037768 +0000 UTC m=+253.673069309" watchObservedRunningTime="2026-03-20 17:21:50.229638693 +0000 UTC m=+253.687670244" Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.235530 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.361747 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" podStartSLOduration=14.361733688 podStartE2EDuration="14.361733688s" podCreationTimestamp="2026-03-20 17:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:50.358321055 +0000 UTC m=+253.816352586" watchObservedRunningTime="2026-03-20 17:21:50.361733688 +0000 UTC m=+253.819765229" Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.772384 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-31 22:59:29.608681624 +0000 UTC Mar 20 17:21:50 crc kubenswrapper[4795]: I0320 17:21:50.772424 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6869h37m38.836260843s for next certificate rotation Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.230952 4795 generic.go:334] "Generic (PLEG): container finished" podID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerID="f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d" exitCode=0 Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.231041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerDied","Data":"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d"} Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.232580 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerID="7ec8b989964af7b47bbfe9e2ba650db31b2b4a3194973fb0444b0a3b977d10bc" exitCode=0 Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.233432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" 
event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerDied","Data":"7ec8b989964af7b47bbfe9e2ba650db31b2b4a3194973fb0444b0a3b977d10bc"} Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.529227 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.607592 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.615714 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.675371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access\") pod \"cdc3472c-58bf-4b57-aa00-34677fc42e06\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.675521 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir\") pod \"cdc3472c-58bf-4b57-aa00-34677fc42e06\" (UID: \"cdc3472c-58bf-4b57-aa00-34677fc42e06\") " Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.675872 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cdc3472c-58bf-4b57-aa00-34677fc42e06" (UID: "cdc3472c-58bf-4b57-aa00-34677fc42e06"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.681073 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cdc3472c-58bf-4b57-aa00-34677fc42e06" (UID: "cdc3472c-58bf-4b57-aa00-34677fc42e06"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776275 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fwn5\" (UniqueName: \"kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5\") pod \"bed1d31b-b060-45c3-95bf-3b226a36efe1\" (UID: \"bed1d31b-b060-45c3-95bf-3b226a36efe1\") " Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776406 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir\") pod \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776452 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access\") pod \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\" (UID: \"8dc34161-d5d0-4580-88a1-c5e2b55c924d\") " Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776713 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdc3472c-58bf-4b57-aa00-34677fc42e06-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776712 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8dc34161-d5d0-4580-88a1-c5e2b55c924d" (UID: "8dc34161-d5d0-4580-88a1-c5e2b55c924d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.776730 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdc3472c-58bf-4b57-aa00-34677fc42e06-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.781145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8dc34161-d5d0-4580-88a1-c5e2b55c924d" (UID: "8dc34161-d5d0-4580-88a1-c5e2b55c924d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.781698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5" (OuterVolumeSpecName: "kube-api-access-9fwn5") pod "bed1d31b-b060-45c3-95bf-3b226a36efe1" (UID: "bed1d31b-b060-45c3-95bf-3b226a36efe1"). InnerVolumeSpecName "kube-api-access-9fwn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.878097 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.878132 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc34161-d5d0-4580-88a1-c5e2b55c924d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.878146 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fwn5\" (UniqueName: \"kubernetes.io/projected/bed1d31b-b060-45c3-95bf-3b226a36efe1-kube-api-access-9fwn5\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917317 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:21:51 crc kubenswrapper[4795]: E0320 17:21:51.917528 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc34161-d5d0-4580-88a1-c5e2b55c924d" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917540 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc34161-d5d0-4580-88a1-c5e2b55c924d" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: E0320 17:21:51.917549 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc3472c-58bf-4b57-aa00-34677fc42e06" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917556 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc3472c-58bf-4b57-aa00-34677fc42e06" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: E0320 17:21:51.917569 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed1d31b-b060-45c3-95bf-3b226a36efe1" 
containerName="oc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917575 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed1d31b-b060-45c3-95bf-3b226a36efe1" containerName="oc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917674 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc34161-d5d0-4580-88a1-c5e2b55c924d" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917701 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed1d31b-b060-45c3-95bf-3b226a36efe1" containerName="oc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.917711 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc3472c-58bf-4b57-aa00-34677fc42e06" containerName="pruner" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.918046 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.920919 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.921034 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.921096 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.921137 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.921311 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.921345 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.925424 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:21:51 crc kubenswrapper[4795]: I0320 17:21:51.929945 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.080862 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.080908 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.080975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57hzx\" (UniqueName: \"kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.081027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.081072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.182720 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57hzx\" (UniqueName: \"kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.182816 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.182868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.182886 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.182901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.183864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.184913 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.186531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 
17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.187155 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.198347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57hzx\" (UniqueName: \"kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx\") pod \"controller-manager-5cf4864966-2hknc\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.242574 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.247674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerStarted","Data":"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784"} Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.249328 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.249372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8dc34161-d5d0-4580-88a1-c5e2b55c924d","Type":"ContainerDied","Data":"72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6"} Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.249454 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72290a73b47fa2bb15c0cf08ade17f60e80e4cdde42e7932d2ee4e45e3523ac6" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.262152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567120-j7789" event={"ID":"bed1d31b-b060-45c3-95bf-3b226a36efe1","Type":"ContainerDied","Data":"dd9d5f9731ec60032210cdc180eb41d5f236e29f6e6729daa332365615c09023"} Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.262184 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd9d5f9731ec60032210cdc180eb41d5f236e29f6e6729daa332365615c09023" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.262223 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-j7789" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.262853 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x4492" podStartSLOduration=6.918039605 podStartE2EDuration="29.262836416s" podCreationTimestamp="2026-03-20 17:21:23 +0000 UTC" firstStartedPulling="2026-03-20 17:21:29.403385631 +0000 UTC m=+232.861417172" lastFinishedPulling="2026-03-20 17:21:51.748182442 +0000 UTC m=+255.206213983" observedRunningTime="2026-03-20 17:21:52.260931898 +0000 UTC m=+255.718963439" watchObservedRunningTime="2026-03-20 17:21:52.262836416 +0000 UTC m=+255.720867957" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.266577 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerStarted","Data":"ccf80d1c25b4b2ca8dbba9a5886768a507ee7b4accfd07c29683288376099b10"} Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.271364 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cdc3472c-58bf-4b57-aa00-34677fc42e06","Type":"ContainerDied","Data":"cdf769cace6742ee2c32cad7bdf9fa8b4a0644fb379f47fb111fd1d463a2ab13"} Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.271390 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.271392 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdf769cace6742ee2c32cad7bdf9fa8b4a0644fb379f47fb111fd1d463a2ab13" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.289258 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l6vnf" podStartSLOduration=2.56873014 podStartE2EDuration="31.289234067s" podCreationTimestamp="2026-03-20 17:21:21 +0000 UTC" firstStartedPulling="2026-03-20 17:21:22.9656045 +0000 UTC m=+226.423636051" lastFinishedPulling="2026-03-20 17:21:51.686108437 +0000 UTC m=+255.144139978" observedRunningTime="2026-03-20 17:21:52.279489036 +0000 UTC m=+255.737520567" watchObservedRunningTime="2026-03-20 17:21:52.289234067 +0000 UTC m=+255.747265608" Mar 20 17:21:52 crc kubenswrapper[4795]: I0320 17:21:52.463199 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:21:52 crc kubenswrapper[4795]: W0320 17:21:52.474446 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb9668af_0fcd_484b_a4dd_929c06088636.slice/crio-9e0445907ee72ebdc97c6462b2a382e515c86153ff2b1c9bb2b03cde9237a5da WatchSource:0}: Error finding container 9e0445907ee72ebdc97c6462b2a382e515c86153ff2b1c9bb2b03cde9237a5da: Status 404 returned error can't find the container with id 9e0445907ee72ebdc97c6462b2a382e515c86153ff2b1c9bb2b03cde9237a5da Mar 20 17:21:53 crc kubenswrapper[4795]: I0320 17:21:53.282146 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" 
event={"ID":"cb9668af-0fcd-484b-a4dd-929c06088636","Type":"ContainerStarted","Data":"75085cc0e6ae210bc7cdac642ecf687ee44273af1075dbe71a7eac4507b22825"} Mar 20 17:21:53 crc kubenswrapper[4795]: I0320 17:21:53.282428 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" event={"ID":"cb9668af-0fcd-484b-a4dd-929c06088636","Type":"ContainerStarted","Data":"9e0445907ee72ebdc97c6462b2a382e515c86153ff2b1c9bb2b03cde9237a5da"} Mar 20 17:21:53 crc kubenswrapper[4795]: I0320 17:21:53.296265 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" podStartSLOduration=17.296234513 podStartE2EDuration="17.296234513s" podCreationTimestamp="2026-03-20 17:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:53.295168914 +0000 UTC m=+256.753200455" watchObservedRunningTime="2026-03-20 17:21:53.296234513 +0000 UTC m=+256.754266054" Mar 20 17:21:53 crc kubenswrapper[4795]: I0320 17:21:53.525566 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:53 crc kubenswrapper[4795]: I0320 17:21:53.525984 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:21:54 crc kubenswrapper[4795]: I0320 17:21:54.287350 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:54 crc kubenswrapper[4795]: I0320 17:21:54.292214 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:21:54 crc kubenswrapper[4795]: I0320 17:21:54.685370 4795 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-marketplace-x4492" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="registry-server" probeResult="failure" output=< Mar 20 17:21:54 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:21:54 crc kubenswrapper[4795]: > Mar 20 17:21:54 crc kubenswrapper[4795]: I0320 17:21:54.693182 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-qj44k" Mar 20 17:21:56 crc kubenswrapper[4795]: I0320 17:21:56.461379 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:21:56 crc kubenswrapper[4795]: I0320 17:21:56.544904 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"] Mar 20 17:21:56 crc kubenswrapper[4795]: I0320 17:21:56.545487 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" podUID="af030096-8488-42df-be2c-a39b58ff0612" containerName="route-controller-manager" containerID="cri-o://de73514b8f320ab741cc124ca96503c98c90527d123632a693748372593ae1e0" gracePeriod=30 Mar 20 17:21:57 crc kubenswrapper[4795]: I0320 17:21:57.307336 4795 generic.go:334] "Generic (PLEG): container finished" podID="af030096-8488-42df-be2c-a39b58ff0612" containerID="de73514b8f320ab741cc124ca96503c98c90527d123632a693748372593ae1e0" exitCode=0 Mar 20 17:21:57 crc kubenswrapper[4795]: I0320 17:21:57.307429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" event={"ID":"af030096-8488-42df-be2c-a39b58ff0612","Type":"ContainerDied","Data":"de73514b8f320ab741cc124ca96503c98c90527d123632a693748372593ae1e0"} Mar 20 17:21:57 crc kubenswrapper[4795]: I0320 17:21:57.307915 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" podUID="cb9668af-0fcd-484b-a4dd-929c06088636" containerName="controller-manager" containerID="cri-o://75085cc0e6ae210bc7cdac642ecf687ee44273af1075dbe71a7eac4507b22825" gracePeriod=30 Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.314289 4795 generic.go:334] "Generic (PLEG): container finished" podID="cb9668af-0fcd-484b-a4dd-929c06088636" containerID="75085cc0e6ae210bc7cdac642ecf687ee44273af1075dbe71a7eac4507b22825" exitCode=0 Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.314339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" event={"ID":"cb9668af-0fcd-484b-a4dd-929c06088636","Type":"ContainerDied","Data":"75085cc0e6ae210bc7cdac642ecf687ee44273af1075dbe71a7eac4507b22825"} Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.511055 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.511921 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.513406 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.513893 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.523313 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.673824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.673930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.775442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.775521 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.775740 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.795223 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:58 crc kubenswrapper[4795]: I0320 17:21:58.843727 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:21:59 crc kubenswrapper[4795]: I0320 17:21:59.305502 4795 patch_prober.go:28] interesting pod/route-controller-manager-b7bc44c6c-lndxp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Mar 20 17:21:59 crc kubenswrapper[4795]: I0320 17:21:59.305580 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" podUID="af030096-8488-42df-be2c-a39b58ff0612" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.127797 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567122-fns4l"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.131109 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-fns4l"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.131199 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.134351 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.134421 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.134621 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.295434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcjr2\" (UniqueName: \"kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2\") pod \"auto-csr-approver-29567122-fns4l\" (UID: \"a0486c12-c384-46ff-925b-bfeefb1d59bb\") " pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.296016 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.324239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" event={"ID":"af030096-8488-42df-be2c-a39b58ff0612","Type":"ContainerDied","Data":"d97024fb9f1af8ccac26b9f9b512b0db9ba1fd92cda4860946d1b691addcb115"} Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.324293 4795 scope.go:117] "RemoveContainer" containerID="de73514b8f320ab741cc124ca96503c98c90527d123632a693748372593ae1e0" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.324402 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.327097 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerStarted","Data":"ccb42fb76e422903cb8ae67b0af284fad8c19cf7bcd45ce5c5bd37e094afd21b"} Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.327463 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"] Mar 20 17:22:00 crc kubenswrapper[4795]: E0320 17:22:00.327737 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af030096-8488-42df-be2c-a39b58ff0612" containerName="route-controller-manager" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.327750 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="af030096-8488-42df-be2c-a39b58ff0612" containerName="route-controller-manager" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.327852 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="af030096-8488-42df-be2c-a39b58ff0612" containerName="route-controller-manager" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.328190 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.337358 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.343534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerStarted","Data":"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa"} Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.368897 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.396055 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca\") pod \"af030096-8488-42df-be2c-a39b58ff0612\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.396141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert\") pod \"af030096-8488-42df-be2c-a39b58ff0612\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.396165 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config\") pod \"af030096-8488-42df-be2c-a39b58ff0612\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.396229 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-69x8r\" (UniqueName: \"kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r\") pod \"af030096-8488-42df-be2c-a39b58ff0612\" (UID: \"af030096-8488-42df-be2c-a39b58ff0612\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.396367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcjr2\" (UniqueName: \"kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2\") pod \"auto-csr-approver-29567122-fns4l\" (UID: \"a0486c12-c384-46ff-925b-bfeefb1d59bb\") " pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.397515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca" (OuterVolumeSpecName: "client-ca") pod "af030096-8488-42df-be2c-a39b58ff0612" (UID: "af030096-8488-42df-be2c-a39b58ff0612"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.397570 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config" (OuterVolumeSpecName: "config") pod "af030096-8488-42df-be2c-a39b58ff0612" (UID: "af030096-8488-42df-be2c-a39b58ff0612"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.405657 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r" (OuterVolumeSpecName: "kube-api-access-69x8r") pod "af030096-8488-42df-be2c-a39b58ff0612" (UID: "af030096-8488-42df-be2c-a39b58ff0612"). InnerVolumeSpecName "kube-api-access-69x8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.415059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcjr2\" (UniqueName: \"kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2\") pod \"auto-csr-approver-29567122-fns4l\" (UID: \"a0486c12-c384-46ff-925b-bfeefb1d59bb\") " pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.419956 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "af030096-8488-42df-be2c-a39b58ff0612" (UID: "af030096-8488-42df-be2c-a39b58ff0612"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.464112 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.489916 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-fns4l" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.498981 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert\") pod \"cb9668af-0fcd-484b-a4dd-929c06088636\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles\") pod \"cb9668af-0fcd-484b-a4dd-929c06088636\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca\") pod \"cb9668af-0fcd-484b-a4dd-929c06088636\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499283 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config\") pod \"cb9668af-0fcd-484b-a4dd-929c06088636\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499314 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57hzx\" (UniqueName: \"kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx\") pod \"cb9668af-0fcd-484b-a4dd-929c06088636\" (UID: \"cb9668af-0fcd-484b-a4dd-929c06088636\") " Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499505 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdfls\" (UniqueName: \"kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499558 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499580 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499624 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69x8r\" (UniqueName: \"kubernetes.io/projected/af030096-8488-42df-be2c-a39b58ff0612-kube-api-access-69x8r\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499634 
4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499643 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af030096-8488-42df-be2c-a39b58ff0612-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.499651 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af030096-8488-42df-be2c-a39b58ff0612-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.500956 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cb9668af-0fcd-484b-a4dd-929c06088636" (UID: "cb9668af-0fcd-484b-a4dd-929c06088636"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.501306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca" (OuterVolumeSpecName: "client-ca") pod "cb9668af-0fcd-484b-a4dd-929c06088636" (UID: "cb9668af-0fcd-484b-a4dd-929c06088636"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.501759 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config" (OuterVolumeSpecName: "config") pod "cb9668af-0fcd-484b-a4dd-929c06088636" (UID: "cb9668af-0fcd-484b-a4dd-929c06088636"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.509846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx" (OuterVolumeSpecName: "kube-api-access-57hzx") pod "cb9668af-0fcd-484b-a4dd-929c06088636" (UID: "cb9668af-0fcd-484b-a4dd-929c06088636"). InnerVolumeSpecName "kube-api-access-57hzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.509948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cb9668af-0fcd-484b-a4dd-929c06088636" (UID: "cb9668af-0fcd-484b-a4dd-929c06088636"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.602747 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.602818 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdfls\" (UniqueName: \"kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.602857 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.602900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.603662 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb9668af-0fcd-484b-a4dd-929c06088636-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.603700 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.603717 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.603728 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb9668af-0fcd-484b-a4dd-929c06088636-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.603744 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57hzx\" (UniqueName: \"kubernetes.io/projected/cb9668af-0fcd-484b-a4dd-929c06088636-kube-api-access-57hzx\") on node 
\"crc\" DevicePath \"\"" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.622730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.623285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.623594 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.626478 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdfls\" (UniqueName: \"kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls\") pod \"route-controller-manager-5c67c6b48f-qm658\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.645061 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.666709 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.674827 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7bc44c6c-lndxp"] Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.824587 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"] Mar 20 17:22:00 crc kubenswrapper[4795]: W0320 17:22:00.831157 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26689f8_7057_45ba_8d53_ae4623ecd2e9.slice/crio-48227db9a6937ae60b0746c4afb05ece27c6be8da72af95842f2494d449cc21b WatchSource:0}: Error finding container 48227db9a6937ae60b0746c4afb05ece27c6be8da72af95842f2494d449cc21b: Status 404 returned error can't find the container with id 48227db9a6937ae60b0746c4afb05ece27c6be8da72af95842f2494d449cc21b Mar 20 17:22:00 crc kubenswrapper[4795]: I0320 17:22:00.901201 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-fns4l"] Mar 20 17:22:00 crc kubenswrapper[4795]: W0320 17:22:00.905736 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0486c12_c384_46ff_925b_bfeefb1d59bb.slice/crio-265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a WatchSource:0}: Error finding container 265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a: Status 404 returned error can't find the container with id 265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a Mar 20 17:22:01 crc 
kubenswrapper[4795]: I0320 17:22:01.269302 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af030096-8488-42df-be2c-a39b58ff0612" path="/var/lib/kubelet/pods/af030096-8488-42df-be2c-a39b58ff0612/volumes" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.351586 4795 generic.go:334] "Generic (PLEG): container finished" podID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerID="ccb42fb76e422903cb8ae67b0af284fad8c19cf7bcd45ce5c5bd37e094afd21b" exitCode=0 Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.351783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerDied","Data":"ccb42fb76e422903cb8ae67b0af284fad8c19cf7bcd45ce5c5bd37e094afd21b"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.353958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" event={"ID":"cb9668af-0fcd-484b-a4dd-929c06088636","Type":"ContainerDied","Data":"9e0445907ee72ebdc97c6462b2a382e515c86153ff2b1c9bb2b03cde9237a5da"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.354042 4795 scope.go:117] "RemoveContainer" containerID="75085cc0e6ae210bc7cdac642ecf687ee44273af1075dbe71a7eac4507b22825" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.353970 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cf4864966-2hknc" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.355852 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567122-fns4l" event={"ID":"a0486c12-c384-46ff-925b-bfeefb1d59bb","Type":"ContainerStarted","Data":"265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.358632 4795 generic.go:334] "Generic (PLEG): container finished" podID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerID="6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa" exitCode=0 Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.358746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerDied","Data":"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.362804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"324189bb-8d17-4759-8902-0e960316a64b","Type":"ContainerStarted","Data":"1165a4e47d8fe5ba2e9d8af8b5728fb3bb7d991766808000e075dbc69758730b"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.362849 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"324189bb-8d17-4759-8902-0e960316a64b","Type":"ContainerStarted","Data":"c56b30507509c740a94c11008112ae36e1c890764e38285ed7897648547f8686"} Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.365053 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" event={"ID":"d26689f8-7057-45ba-8d53-ae4623ecd2e9","Type":"ContainerStarted","Data":"48227db9a6937ae60b0746c4afb05ece27c6be8da72af95842f2494d449cc21b"} Mar 
20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.401645 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=3.401625939 podStartE2EDuration="3.401625939s" podCreationTimestamp="2026-03-20 17:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:01.398618575 +0000 UTC m=+264.856650156" watchObservedRunningTime="2026-03-20 17:22:01.401625939 +0000 UTC m=+264.859657490" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.425185 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.435879 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5cf4864966-2hknc"] Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.744317 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.744352 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:22:01 crc kubenswrapper[4795]: I0320 17:22:01.809156 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.374714 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" event={"ID":"d26689f8-7057-45ba-8d53-ae4623ecd2e9","Type":"ContainerStarted","Data":"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f"} Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.375301 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.408923 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerStarted","Data":"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741"} Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.419202 4795 generic.go:334] "Generic (PLEG): container finished" podID="324189bb-8d17-4759-8902-0e960316a64b" containerID="1165a4e47d8fe5ba2e9d8af8b5728fb3bb7d991766808000e075dbc69758730b" exitCode=0 Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.419319 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"324189bb-8d17-4759-8902-0e960316a64b","Type":"ContainerDied","Data":"1165a4e47d8fe5ba2e9d8af8b5728fb3bb7d991766808000e075dbc69758730b"} Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.427174 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" podStartSLOduration=6.427158685 podStartE2EDuration="6.427158685s" podCreationTimestamp="2026-03-20 17:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:02.398065852 +0000 UTC m=+265.856097493" watchObservedRunningTime="2026-03-20 17:22:02.427158685 +0000 UTC m=+265.885190236" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.460112 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l6vnf" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.876718 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.923662 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"] Mar 20 17:22:02 crc kubenswrapper[4795]: E0320 17:22:02.923892 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9668af-0fcd-484b-a4dd-929c06088636" containerName="controller-manager" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.923905 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9668af-0fcd-484b-a4dd-929c06088636" containerName="controller-manager" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.924001 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9668af-0fcd-484b-a4dd-929c06088636" containerName="controller-manager" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.924316 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.937228 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.937734 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.937860 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.938795 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.941231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.941819 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.942087 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"] Mar 20 17:22:02 crc kubenswrapper[4795]: I0320 17:22:02.944526 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.044401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.044651 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.044782 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.044871 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tljdh\" (UniqueName: \"kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.044982 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.146506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.146565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.146590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " 
pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.146614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tljdh\" (UniqueName: \"kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.146651 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.147837 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.147947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.148277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config\") pod 
\"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.154469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.162540 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tljdh\" (UniqueName: \"kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh\") pod \"controller-manager-6489dcd9dc-4rk52\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.251677 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.259068 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb9668af-0fcd-484b-a4dd-929c06088636" path="/var/lib/kubelet/pods/cb9668af-0fcd-484b-a4dd-929c06088636/volumes" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.308584 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.309279 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.322345 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.426802 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerStarted","Data":"c35ab86ff89c55002aa81e7c712981fc2a1f900a50a13c618ac66b790346ea8a"} Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.428817 4795 generic.go:334] "Generic (PLEG): container finished" podID="a0486c12-c384-46ff-925b-bfeefb1d59bb" containerID="14e15a12796f646063cb5f653e99e6ad23f1724726dfb97b08e9621c085665c1" exitCode=0 Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.428918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567122-fns4l" event={"ID":"a0486c12-c384-46ff-925b-bfeefb1d59bb","Type":"ContainerDied","Data":"14e15a12796f646063cb5f653e99e6ad23f1724726dfb97b08e9621c085665c1"} Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.431894 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerID="383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c" exitCode=0 Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.431940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerDied","Data":"383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c"} Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.448696 4795 generic.go:334] "Generic (PLEG): container finished" podID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerID="824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741" exitCode=0 Mar 20 17:22:03 crc 
kubenswrapper[4795]: I0320 17:22:03.448776 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerDied","Data":"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741"} Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.450195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.450282 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.450315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.453892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerStarted","Data":"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c"} Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.466659 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2m5ld" 
podStartSLOduration=27.323917933 podStartE2EDuration="39.466639696s" podCreationTimestamp="2026-03-20 17:21:24 +0000 UTC" firstStartedPulling="2026-03-20 17:21:50.206551043 +0000 UTC m=+253.664582584" lastFinishedPulling="2026-03-20 17:22:02.349272806 +0000 UTC m=+265.807304347" observedRunningTime="2026-03-20 17:22:03.44970783 +0000 UTC m=+266.907739371" watchObservedRunningTime="2026-03-20 17:22:03.466639696 +0000 UTC m=+266.924671237" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.525096 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q7czt" podStartSLOduration=28.374026407 podStartE2EDuration="40.525074661s" podCreationTimestamp="2026-03-20 17:21:23 +0000 UTC" firstStartedPulling="2026-03-20 17:21:50.182110852 +0000 UTC m=+253.640142383" lastFinishedPulling="2026-03-20 17:22:02.333159096 +0000 UTC m=+265.791190637" observedRunningTime="2026-03-20 17:22:03.521043676 +0000 UTC m=+266.979075227" watchObservedRunningTime="2026-03-20 17:22:03.525074661 +0000 UTC m=+266.983106212" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.551523 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.551789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.551979 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.552095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.552100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.564877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.571736 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access\") pod \"installer-9-crc\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.619222 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x4492" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.667800 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.673333 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.684243 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"] Mar 20 17:22:03 crc kubenswrapper[4795]: W0320 17:22:03.687773 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85b922ea_7281_44f5_b78b_b0ec5d5387d3.slice/crio-7b73ca2a9ce585e0f34c21ad9d5d09f64d1d3d058ddcee892f2d60850aae971b WatchSource:0}: Error finding container 7b73ca2a9ce585e0f34c21ad9d5d09f64d1d3d058ddcee892f2d60850aae971b: Status 404 returned error can't find the container with id 7b73ca2a9ce585e0f34c21ad9d5d09f64d1d3d058ddcee892f2d60850aae971b Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.758936 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir\") pod \"324189bb-8d17-4759-8902-0e960316a64b\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.759268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access\") pod \"324189bb-8d17-4759-8902-0e960316a64b\" (UID: \"324189bb-8d17-4759-8902-0e960316a64b\") " Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.759068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "324189bb-8d17-4759-8902-0e960316a64b" (UID: 
"324189bb-8d17-4759-8902-0e960316a64b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.766436 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "324189bb-8d17-4759-8902-0e960316a64b" (UID: "324189bb-8d17-4759-8902-0e960316a64b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.861858 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/324189bb-8d17-4759-8902-0e960316a64b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:03 crc kubenswrapper[4795]: I0320 17:22:03.861889 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/324189bb-8d17-4759-8902-0e960316a64b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.139782 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:22:04 crc kubenswrapper[4795]: W0320 17:22:04.181916 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod49054187_cb30_4f07_b67a_794c2503f50a.slice/crio-bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa WatchSource:0}: Error finding container bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa: Status 404 returned error can't find the container with id bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.323878 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:22:04 crc 
kubenswrapper[4795]: I0320 17:22:04.323928 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.461348 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"324189bb-8d17-4759-8902-0e960316a64b","Type":"ContainerDied","Data":"c56b30507509c740a94c11008112ae36e1c890764e38285ed7897648547f8686"} Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.461388 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56b30507509c740a94c11008112ae36e1c890764e38285ed7897648547f8686" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.462412 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.463291 4795 generic.go:334] "Generic (PLEG): container finished" podID="70000016-e928-4b11-a31d-4d08e9450a1c" containerID="6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e" exitCode=0 Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.463343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerDied","Data":"6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e"} Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.465111 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" event={"ID":"85b922ea-7281-44f5-b78b-b0ec5d5387d3","Type":"ContainerStarted","Data":"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b"} Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.465153 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" event={"ID":"85b922ea-7281-44f5-b78b-b0ec5d5387d3","Type":"ContainerStarted","Data":"7b73ca2a9ce585e0f34c21ad9d5d09f64d1d3d058ddcee892f2d60850aae971b"}
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.465497 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.466528 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"49054187-cb30-4f07-b67a-794c2503f50a","Type":"ContainerStarted","Data":"bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa"}
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.468167 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerStarted","Data":"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a"}
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.469753 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.470961 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerStarted","Data":"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1"}
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.491919 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" podStartSLOduration=8.491904795 podStartE2EDuration="8.491904795s" podCreationTimestamp="2026-03-20 17:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:04.490265394 +0000 UTC m=+267.948296935" watchObservedRunningTime="2026-03-20 17:22:04.491904795 +0000 UTC m=+267.949936346"
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.517655 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kzvch" podStartSLOduration=2.295216382 podStartE2EDuration="44.517642095s" podCreationTimestamp="2026-03-20 17:21:20 +0000 UTC" firstStartedPulling="2026-03-20 17:21:21.938448969 +0000 UTC m=+225.396480510" lastFinishedPulling="2026-03-20 17:22:04.160874682 +0000 UTC m=+267.618906223" observedRunningTime="2026-03-20 17:22:04.514996542 +0000 UTC m=+267.973028083" watchObservedRunningTime="2026-03-20 17:22:04.517642095 +0000 UTC m=+267.975673636"
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.542619 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hv8kd" podStartSLOduration=2.648201927 podStartE2EDuration="43.5426005s" podCreationTimestamp="2026-03-20 17:21:21 +0000 UTC" firstStartedPulling="2026-03-20 17:21:22.971895947 +0000 UTC m=+226.429927488" lastFinishedPulling="2026-03-20 17:22:03.86629452 +0000 UTC m=+267.324326061" observedRunningTime="2026-03-20 17:22:04.542330231 +0000 UTC m=+268.000361772" watchObservedRunningTime="2026-03-20 17:22:04.5426005 +0000 UTC m=+268.000632031"
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.724041 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-fns4l"
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.749667 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2m5ld"
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.749733 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2m5ld"
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.882021 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcjr2\" (UniqueName: \"kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2\") pod \"a0486c12-c384-46ff-925b-bfeefb1d59bb\" (UID: \"a0486c12-c384-46ff-925b-bfeefb1d59bb\") "
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.887965 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2" (OuterVolumeSpecName: "kube-api-access-tcjr2") pod "a0486c12-c384-46ff-925b-bfeefb1d59bb" (UID: "a0486c12-c384-46ff-925b-bfeefb1d59bb"). InnerVolumeSpecName "kube-api-access-tcjr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:22:04 crc kubenswrapper[4795]: I0320 17:22:04.983841 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcjr2\" (UniqueName: \"kubernetes.io/projected/a0486c12-c384-46ff-925b-bfeefb1d59bb-kube-api-access-tcjr2\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.362641 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q7czt" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="registry-server" probeResult="failure" output=<
Mar 20 17:22:05 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Mar 20 17:22:05 crc kubenswrapper[4795]: >
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.408506 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"]
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.408911 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l6vnf" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="registry-server" containerID="cri-o://ccf80d1c25b4b2ca8dbba9a5886768a507ee7b4accfd07c29683288376099b10" gracePeriod=2
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.477378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567122-fns4l" event={"ID":"a0486c12-c384-46ff-925b-bfeefb1d59bb","Type":"ContainerDied","Data":"265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a"}
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.477424 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="265692bcf01910557d4333aeafba5871c321c82d259425a86492b5f277de974a"
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.477488 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-fns4l"
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.478918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"49054187-cb30-4f07-b67a-794c2503f50a","Type":"ContainerStarted","Data":"30207c849aa57355d0d1027a4dded499a63ab8fad8a6d8162ced0752bd75a382"}
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.482981 4795 generic.go:334] "Generic (PLEG): container finished" podID="57849322-f280-42ee-a330-18120aeed5db" containerID="36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f" exitCode=0
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.483020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerDied","Data":"36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f"}
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.486871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerStarted","Data":"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d"}
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.498554 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.498530774 podStartE2EDuration="2.498530774s" podCreationTimestamp="2026-03-20 17:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:05.49740498 +0000 UTC m=+268.955436521" watchObservedRunningTime="2026-03-20 17:22:05.498530774 +0000 UTC m=+268.956562315"
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.519834 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ht4zv" podStartSLOduration=2.705297964 podStartE2EDuration="43.519816076s" podCreationTimestamp="2026-03-20 17:21:22 +0000 UTC" firstStartedPulling="2026-03-20 17:21:24.059380699 +0000 UTC m=+227.517412240" lastFinishedPulling="2026-03-20 17:22:04.873898811 +0000 UTC m=+268.331930352" observedRunningTime="2026-03-20 17:22:05.517457423 +0000 UTC m=+268.975488964" watchObservedRunningTime="2026-03-20 17:22:05.519816076 +0000 UTC m=+268.977847617"
Mar 20 17:22:05 crc kubenswrapper[4795]: I0320 17:22:05.787948 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2m5ld" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="registry-server" probeResult="failure" output=<
Mar 20 17:22:05 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Mar 20 17:22:05 crc kubenswrapper[4795]: >
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.493356 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerID="ccf80d1c25b4b2ca8dbba9a5886768a507ee7b4accfd07c29683288376099b10" exitCode=0
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.494279 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerDied","Data":"ccf80d1c25b4b2ca8dbba9a5886768a507ee7b4accfd07c29683288376099b10"}
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.689320 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l6vnf"
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.806534 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content\") pod \"9b3b1055-857d-4334-b39a-24b0ac9139d1\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") "
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.806594 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vks54\" (UniqueName: \"kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54\") pod \"9b3b1055-857d-4334-b39a-24b0ac9139d1\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") "
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.806674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities\") pod \"9b3b1055-857d-4334-b39a-24b0ac9139d1\" (UID: \"9b3b1055-857d-4334-b39a-24b0ac9139d1\") "
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.807475 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities" (OuterVolumeSpecName: "utilities") pod "9b3b1055-857d-4334-b39a-24b0ac9139d1" (UID: "9b3b1055-857d-4334-b39a-24b0ac9139d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.811940 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54" (OuterVolumeSpecName: "kube-api-access-vks54") pod "9b3b1055-857d-4334-b39a-24b0ac9139d1" (UID: "9b3b1055-857d-4334-b39a-24b0ac9139d1"). InnerVolumeSpecName "kube-api-access-vks54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.857480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b3b1055-857d-4334-b39a-24b0ac9139d1" (UID: "9b3b1055-857d-4334-b39a-24b0ac9139d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.907645 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.907692 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b3b1055-857d-4334-b39a-24b0ac9139d1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:06 crc kubenswrapper[4795]: I0320 17:22:06.907704 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vks54\" (UniqueName: \"kubernetes.io/projected/9b3b1055-857d-4334-b39a-24b0ac9139d1-kube-api-access-vks54\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.502080 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l6vnf"
Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.502087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6vnf" event={"ID":"9b3b1055-857d-4334-b39a-24b0ac9139d1","Type":"ContainerDied","Data":"1491707e1e9a76fca81ebcac286e8ae28085443c9768ed4379919b2d9e23dca3"}
Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.502246 4795 scope.go:117] "RemoveContainer" containerID="ccf80d1c25b4b2ca8dbba9a5886768a507ee7b4accfd07c29683288376099b10"
Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.504625 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerStarted","Data":"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233"}
Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.522191 4795 scope.go:117] "RemoveContainer" containerID="7ec8b989964af7b47bbfe9e2ba650db31b2b4a3194973fb0444b0a3b977d10bc"
Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.529626 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kk5rk" podStartSLOduration=3.468618398 podStartE2EDuration="47.529602088s" podCreationTimestamp="2026-03-20 17:21:20 +0000 UTC" firstStartedPulling="2026-03-20 17:21:22.97421773 +0000 UTC m=+226.432249271" lastFinishedPulling="2026-03-20 17:22:07.03520142 +0000 UTC m=+270.493232961" observedRunningTime="2026-03-20 17:22:07.524917052 +0000 UTC m=+270.982948593" watchObservedRunningTime="2026-03-20 17:22:07.529602088 +0000 UTC m=+270.987633629"
Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.540508 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"]
Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.541475 4795 scope.go:117] "RemoveContainer" containerID="82d8d3d7e1e3eb80a041eea63e969e0d1aa9af7af1cccc7c6c9f3460b4809935"
Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.547068 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l6vnf"]
Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.609110 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"]
Mar 20 17:22:07 crc kubenswrapper[4795]: I0320 17:22:07.609378 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x4492" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="registry-server" containerID="cri-o://89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784" gracePeriod=2
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.121128 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4492"
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.269988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities\") pod \"366eee86-1ca2-4662-b32d-c00d4c1d513f\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") "
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.270082 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content\") pod \"366eee86-1ca2-4662-b32d-c00d4c1d513f\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") "
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.270104 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js4cp\" (UniqueName: \"kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp\") pod \"366eee86-1ca2-4662-b32d-c00d4c1d513f\" (UID: \"366eee86-1ca2-4662-b32d-c00d4c1d513f\") "
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.271011 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities" (OuterVolumeSpecName: "utilities") pod "366eee86-1ca2-4662-b32d-c00d4c1d513f" (UID: "366eee86-1ca2-4662-b32d-c00d4c1d513f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.276779 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp" (OuterVolumeSpecName: "kube-api-access-js4cp") pod "366eee86-1ca2-4662-b32d-c00d4c1d513f" (UID: "366eee86-1ca2-4662-b32d-c00d4c1d513f"). InnerVolumeSpecName "kube-api-access-js4cp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.301954 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "366eee86-1ca2-4662-b32d-c00d4c1d513f" (UID: "366eee86-1ca2-4662-b32d-c00d4c1d513f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.371189 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.371221 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/366eee86-1ca2-4662-b32d-c00d4c1d513f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.371231 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js4cp\" (UniqueName: \"kubernetes.io/projected/366eee86-1ca2-4662-b32d-c00d4c1d513f-kube-api-access-js4cp\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.511059 4795 generic.go:334] "Generic (PLEG): container finished" podID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerID="89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784" exitCode=0
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.511103 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4492"
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.511278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerDied","Data":"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784"}
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.511326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4492" event={"ID":"366eee86-1ca2-4662-b32d-c00d4c1d513f","Type":"ContainerDied","Data":"0bf34893ceb2a123dbae4a13fdf9053d4d9c1472bbfe52b966a8795f5fc54346"}
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.511344 4795 scope.go:117] "RemoveContainer" containerID="89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784"
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.525718 4795 scope.go:117] "RemoveContainer" containerID="f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d"
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.537073 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"]
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.543800 4795 scope.go:117] "RemoveContainer" containerID="356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf"
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.545680 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4492"]
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.556608 4795 scope.go:117] "RemoveContainer" containerID="89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784"
Mar 20 17:22:08 crc kubenswrapper[4795]: E0320 17:22:08.557014 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784\": container with ID starting with 89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784 not found: ID does not exist" containerID="89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784"
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.557043 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784"} err="failed to get container status \"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784\": rpc error: code = NotFound desc = could not find container \"89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784\": container with ID starting with 89d40540ddd56877fb3f33765b8bd951e9d2222a403f8c24e743d5e9082c0784 not found: ID does not exist"
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.557064 4795 scope.go:117] "RemoveContainer" containerID="f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d"
Mar 20 17:22:08 crc kubenswrapper[4795]: E0320 17:22:08.557366 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d\": container with ID starting with f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d not found: ID does not exist" containerID="f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d"
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.557428 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d"} err="failed to get container status \"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d\": rpc error: code = NotFound desc = could not find container \"f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d\": container with ID starting with f9e6e6b9b878ab03bab9bb879bdaa42953a57473af862d71aae3a3242f88162d not found: ID does not exist"
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.557456 4795 scope.go:117] "RemoveContainer" containerID="356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf"
Mar 20 17:22:08 crc kubenswrapper[4795]: E0320 17:22:08.557841 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf\": container with ID starting with 356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf not found: ID does not exist" containerID="356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf"
Mar 20 17:22:08 crc kubenswrapper[4795]: I0320 17:22:08.557893 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf"} err="failed to get container status \"356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf\": rpc error: code = NotFound desc = could not find container \"356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf\": container with ID starting with 356ba9e23ab164833d69a526ed903dd6c4acd1ccc2d17f09310b99677f6d9abf not found: ID does not exist"
Mar 20 17:22:09 crc kubenswrapper[4795]: I0320 17:22:09.258725 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" path="/var/lib/kubelet/pods/366eee86-1ca2-4662-b32d-c00d4c1d513f/volumes"
Mar 20 17:22:09 crc kubenswrapper[4795]: I0320 17:22:09.259674 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" path="/var/lib/kubelet/pods/9b3b1055-857d-4334-b39a-24b0ac9139d1/volumes"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.105278 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kk5rk"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.105331 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kk5rk"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.171189 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kk5rk"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.300708 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.300769 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.300817 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.301209 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.301271 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506" gracePeriod=600
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.303623 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kzvch"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.304019 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kzvch"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.357175 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kzvch"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.529945 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hv8kd"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.530593 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hv8kd"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.571237 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hv8kd"
Mar 20 17:22:11 crc kubenswrapper[4795]: I0320 17:22:11.580372 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kzvch"
Mar 20 17:22:12 crc kubenswrapper[4795]: I0320 17:22:12.551940 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506" exitCode=0
Mar 20 17:22:12 crc kubenswrapper[4795]: I0320 17:22:12.552018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506"}
Mar 20 17:22:12 crc kubenswrapper[4795]: I0320 17:22:12.612076 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hv8kd"
Mar 20 17:22:13 crc kubenswrapper[4795]: I0320 17:22:13.135399 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ht4zv"
Mar 20 17:22:13 crc kubenswrapper[4795]: I0320 17:22:13.135764 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ht4zv"
Mar 20 17:22:13 crc kubenswrapper[4795]: I0320 17:22:13.191321 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ht4zv"
Mar 20 17:22:13 crc kubenswrapper[4795]: I0320 17:22:13.558828 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b"}
Mar 20 17:22:13 crc kubenswrapper[4795]: I0320 17:22:13.608638 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ht4zv"
Mar 20 17:22:14 crc kubenswrapper[4795]: I0320 17:22:14.013508 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hv8kd"]
Mar 20 17:22:14 crc kubenswrapper[4795]: I0320 17:22:14.368536 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q7czt"
Mar 20 17:22:14 crc kubenswrapper[4795]: I0320 17:22:14.468073 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q7czt"
Mar 20 17:22:14 crc kubenswrapper[4795]: I0320 17:22:14.801566 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2m5ld"
Mar 20 17:22:14 crc kubenswrapper[4795]: I0320 17:22:14.852001 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2m5ld"
Mar 20 17:22:15 crc kubenswrapper[4795]: I0320 17:22:15.565865 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hv8kd" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="registry-server" containerID="cri-o://a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a" gracePeriod=2
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.060049 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hv8kd"
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.103899 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content\") pod \"7b4d98b5-0434-4a84-b890-d2428de998b7\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") "
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.104000 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities\") pod \"7b4d98b5-0434-4a84-b890-d2428de998b7\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") "
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.104144 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qsng\" (UniqueName: \"kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng\") pod \"7b4d98b5-0434-4a84-b890-d2428de998b7\" (UID: \"7b4d98b5-0434-4a84-b890-d2428de998b7\") "
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.108837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities" (OuterVolumeSpecName: "utilities") pod "7b4d98b5-0434-4a84-b890-d2428de998b7" (UID: "7b4d98b5-0434-4a84-b890-d2428de998b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.109837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng" (OuterVolumeSpecName: "kube-api-access-2qsng") pod "7b4d98b5-0434-4a84-b890-d2428de998b7" (UID: "7b4d98b5-0434-4a84-b890-d2428de998b7"). InnerVolumeSpecName "kube-api-access-2qsng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.155056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b4d98b5-0434-4a84-b890-d2428de998b7" (UID: "7b4d98b5-0434-4a84-b890-d2428de998b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.205893 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.205929 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b4d98b5-0434-4a84-b890-d2428de998b7-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.205939 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qsng\" (UniqueName: \"kubernetes.io/projected/7b4d98b5-0434-4a84-b890-d2428de998b7-kube-api-access-2qsng\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.463877 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"]
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.464459 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" podUID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" containerName="controller-manager" containerID="cri-o://9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b" gracePeriod=30
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.481123 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"]
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.481411 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" podUID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" containerName="route-controller-manager" containerID="cri-o://46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f" gracePeriod=30
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.572637 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerID="a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a" exitCode=0
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.572704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerDied","Data":"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a"}
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.572741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv8kd" event={"ID":"7b4d98b5-0434-4a84-b890-d2428de998b7","Type":"ContainerDied","Data":"2e4b95450b8315d24d755f1e11d49c142fbb519e0e3cc346ea05ef39f9bff4b9"}
Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.572741 4795 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-hv8kd" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.572764 4795 scope.go:117] "RemoveContainer" containerID="a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.697348 4795 scope.go:117] "RemoveContainer" containerID="383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.723771 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hv8kd"] Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.726913 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hv8kd"] Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.741811 4795 scope.go:117] "RemoveContainer" containerID="051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.753407 4795 scope.go:117] "RemoveContainer" containerID="a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a" Mar 20 17:22:16 crc kubenswrapper[4795]: E0320 17:22:16.753749 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a\": container with ID starting with a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a not found: ID does not exist" containerID="a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.753788 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a"} err="failed to get container status \"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a\": rpc error: code = NotFound desc = could not find 
container \"a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a\": container with ID starting with a0a4adc91005601709328ab2b44123e57198fc5e851bcf99d0426e7a5bb3ce4a not found: ID does not exist" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.753815 4795 scope.go:117] "RemoveContainer" containerID="383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c" Mar 20 17:22:16 crc kubenswrapper[4795]: E0320 17:22:16.754164 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c\": container with ID starting with 383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c not found: ID does not exist" containerID="383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.754191 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c"} err="failed to get container status \"383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c\": rpc error: code = NotFound desc = could not find container \"383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c\": container with ID starting with 383d70f53dee18e945768f0332046b8b36adb72ba68e363a31ec37039f62d95c not found: ID does not exist" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.754207 4795 scope.go:117] "RemoveContainer" containerID="051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6" Mar 20 17:22:16 crc kubenswrapper[4795]: E0320 17:22:16.754475 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6\": container with ID starting with 051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6 not found: ID does 
not exist" containerID="051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6" Mar 20 17:22:16 crc kubenswrapper[4795]: I0320 17:22:16.754492 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6"} err="failed to get container status \"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6\": rpc error: code = NotFound desc = could not find container \"051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6\": container with ID starting with 051c26c537c1aa3fbccd3f7da3d785f560f24f13b325ffaccecf9a7dceb260d6 not found: ID does not exist" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.023830 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.055783 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.118774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca\") pod \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.118850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tljdh\" (UniqueName: \"kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh\") pod \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.118918 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdfls\" (UniqueName: \"kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls\") pod \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.118963 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca\") pod \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles\") pod \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119080 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config\") pod \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert\") pod \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119167 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert\") pod \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\" (UID: \"d26689f8-7057-45ba-8d53-ae4623ecd2e9\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119202 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config\") pod \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\" (UID: \"85b922ea-7281-44f5-b78b-b0ec5d5387d3\") " Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119705 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "85b922ea-7281-44f5-b78b-b0ec5d5387d3" (UID: "85b922ea-7281-44f5-b78b-b0ec5d5387d3"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119887 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca" (OuterVolumeSpecName: "client-ca") pod "85b922ea-7281-44f5-b78b-b0ec5d5387d3" (UID: "85b922ea-7281-44f5-b78b-b0ec5d5387d3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119901 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config" (OuterVolumeSpecName: "config") pod "d26689f8-7057-45ba-8d53-ae4623ecd2e9" (UID: "d26689f8-7057-45ba-8d53-ae4623ecd2e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.119961 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "d26689f8-7057-45ba-8d53-ae4623ecd2e9" (UID: "d26689f8-7057-45ba-8d53-ae4623ecd2e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.120041 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config" (OuterVolumeSpecName: "config") pod "85b922ea-7281-44f5-b78b-b0ec5d5387d3" (UID: "85b922ea-7281-44f5-b78b-b0ec5d5387d3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.123410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d26689f8-7057-45ba-8d53-ae4623ecd2e9" (UID: "d26689f8-7057-45ba-8d53-ae4623ecd2e9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.123416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "85b922ea-7281-44f5-b78b-b0ec5d5387d3" (UID: "85b922ea-7281-44f5-b78b-b0ec5d5387d3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.123515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh" (OuterVolumeSpecName: "kube-api-access-tljdh") pod "85b922ea-7281-44f5-b78b-b0ec5d5387d3" (UID: "85b922ea-7281-44f5-b78b-b0ec5d5387d3"). InnerVolumeSpecName "kube-api-access-tljdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.127807 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls" (OuterVolumeSpecName: "kube-api-access-bdfls") pod "d26689f8-7057-45ba-8d53-ae4623ecd2e9" (UID: "d26689f8-7057-45ba-8d53-ae4623ecd2e9"). InnerVolumeSpecName "kube-api-access-bdfls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220829 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220865 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tljdh\" (UniqueName: \"kubernetes.io/projected/85b922ea-7281-44f5-b78b-b0ec5d5387d3-kube-api-access-tljdh\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220876 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdfls\" (UniqueName: \"kubernetes.io/projected/d26689f8-7057-45ba-8d53-ae4623ecd2e9-kube-api-access-bdfls\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220885 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220893 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220902 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26689f8-7057-45ba-8d53-ae4623ecd2e9-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220909 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85b922ea-7281-44f5-b78b-b0ec5d5387d3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220917 4795 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d26689f8-7057-45ba-8d53-ae4623ecd2e9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.220926 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b922ea-7281-44f5-b78b-b0ec5d5387d3-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.262001 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" path="/var/lib/kubelet/pods/7b4d98b5-0434-4a84-b890-d2428de998b7/volumes" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.580175 4795 generic.go:334] "Generic (PLEG): container finished" podID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" containerID="46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f" exitCode=0 Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.580286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" event={"ID":"d26689f8-7057-45ba-8d53-ae4623ecd2e9","Type":"ContainerDied","Data":"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f"} Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.580327 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.580625 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658" event={"ID":"d26689f8-7057-45ba-8d53-ae4623ecd2e9","Type":"ContainerDied","Data":"48227db9a6937ae60b0746c4afb05ece27c6be8da72af95842f2494d449cc21b"} Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.580670 4795 scope.go:117] "RemoveContainer" containerID="46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.585738 4795 generic.go:334] "Generic (PLEG): container finished" podID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" containerID="9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b" exitCode=0 Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.585812 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.585876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" event={"ID":"85b922ea-7281-44f5-b78b-b0ec5d5387d3","Type":"ContainerDied","Data":"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b"} Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.585929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6489dcd9dc-4rk52" event={"ID":"85b922ea-7281-44f5-b78b-b0ec5d5387d3","Type":"ContainerDied","Data":"7b73ca2a9ce585e0f34c21ad9d5d09f64d1d3d058ddcee892f2d60850aae971b"} Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.609822 4795 scope.go:117] "RemoveContainer" containerID="46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.611312 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f\": container with ID starting with 46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f not found: ID does not exist" containerID="46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.611386 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f"} err="failed to get container status \"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f\": rpc error: code = NotFound desc = could not find container \"46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f\": container with ID starting with 46c61d8882008b6b5da15c7bee97deab3222a4d1bea9ad0180f506740843ce4f not found: ID does 
not exist" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.611443 4795 scope.go:117] "RemoveContainer" containerID="9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.624260 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"] Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.629082 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c67c6b48f-qm658"] Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.638896 4795 scope.go:117] "RemoveContainer" containerID="9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.639460 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b\": container with ID starting with 9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b not found: ID does not exist" containerID="9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.639491 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b"} err="failed to get container status \"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b\": rpc error: code = NotFound desc = could not find container \"9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b\": container with ID starting with 9e6e477a2e14827cae9492d7ee9167d83350edf8588cada9352c48fcc2b7ee9b not found: ID does not exist" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.645843 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"] Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.649794 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6489dcd9dc-4rk52"] Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.942839 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"] Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.943409 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.943815 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.944269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="324189bb-8d17-4759-8902-0e960316a64b" containerName="pruner" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.944450 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="324189bb-8d17-4759-8902-0e960316a64b" containerName="pruner" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.944603 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.944756 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.944880 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.945031 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.946509 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.946883 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.947094 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" containerName="route-controller-manager" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.947591 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" containerName="route-controller-manager" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.947928 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0486c12-c384-46ff-925b-bfeefb1d59bb" containerName="oc" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.948137 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0486c12-c384-46ff-925b-bfeefb1d59bb" containerName="oc" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.948587 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" containerName="controller-manager" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.949929 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" containerName="controller-manager" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.950150 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.951105 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.951312 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.951517 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="extract-content" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.951769 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.952137 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.952269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.952377 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="registry-server" Mar 20 17:22:17 crc kubenswrapper[4795]: E0320 17:22:17.952488 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.952597 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="extract-utilities" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.952978 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" containerName="controller-manager" Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953115 4795 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" containerName="route-controller-manager"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953278 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="366eee86-1ca2-4662-b32d-c00d4c1d513f" containerName="registry-server"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953417 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0486c12-c384-46ff-925b-bfeefb1d59bb" containerName="oc"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953527 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="324189bb-8d17-4759-8902-0e960316a64b" containerName="pruner"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953644 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b3b1055-857d-4334-b39a-24b0ac9139d1" containerName="registry-server"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.953833 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4d98b5-0434-4a84-b890-d2428de998b7" containerName="registry-server"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.954748 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"]
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.954934 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.955893 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"]
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.955934 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"]
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.956017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.957086 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.957991 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.958151 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.958517 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.958717 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.958967 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.959069 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.959169 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.959536 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.959628 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.960070 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.960185 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 17:22:17 crc kubenswrapper[4795]: I0320 17:22:17.964587 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028671 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028733 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtd2x\" (UniqueName: \"kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028820 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028935 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028960 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh6fm\" (UniqueName: \"kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.028996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.130827 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.130932 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.130972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtd2x\" (UniqueName: \"kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131170 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131254 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131290 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.131329 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh6fm\" (UniqueName: \"kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.132803 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.132847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.133089 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.133324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.137024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.137730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.139673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.162388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh6fm\" (UniqueName: \"kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm\") pod \"route-controller-manager-6876bd5d54-rptsb\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") " pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.163844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtd2x\" (UniqueName: \"kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x\") pod \"controller-manager-7844559998-cfclj\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") " pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.324445 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.333306 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.408872 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"]
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.409199 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2m5ld" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="registry-server" containerID="cri-o://c35ab86ff89c55002aa81e7c712981fc2a1f900a50a13c618ac66b790346ea8a" gracePeriod=2
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.592704 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"]
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.598624 4795 generic.go:334] "Generic (PLEG): container finished" podID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerID="c35ab86ff89c55002aa81e7c712981fc2a1f900a50a13c618ac66b790346ea8a" exitCode=0
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.598677 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerDied","Data":"c35ab86ff89c55002aa81e7c712981fc2a1f900a50a13c618ac66b790346ea8a"}
Mar 20 17:22:18 crc kubenswrapper[4795]: W0320 17:22:18.611844 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d WatchSource:0}: Error finding container 291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d: Status 404 returned error can't find the container with id 291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.769332 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"]
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.770463 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2m5ld"
Mar 20 17:22:18 crc kubenswrapper[4795]: W0320 17:22:18.776809 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7 WatchSource:0}: Error finding container acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7: Status 404 returned error can't find the container with id acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.938876 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities\") pod \"58cc2d60-9778-460a-bd81-89c8078a4d96\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") "
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.939328 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dff8d\" (UniqueName: \"kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d\") pod \"58cc2d60-9778-460a-bd81-89c8078a4d96\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") "
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.939385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content\") pod \"58cc2d60-9778-460a-bd81-89c8078a4d96\" (UID: \"58cc2d60-9778-460a-bd81-89c8078a4d96\") "
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.939761 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities" (OuterVolumeSpecName: "utilities") pod "58cc2d60-9778-460a-bd81-89c8078a4d96" (UID: "58cc2d60-9778-460a-bd81-89c8078a4d96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:22:18 crc kubenswrapper[4795]: I0320 17:22:18.948391 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d" (OuterVolumeSpecName: "kube-api-access-dff8d") pod "58cc2d60-9778-460a-bd81-89c8078a4d96" (UID: "58cc2d60-9778-460a-bd81-89c8078a4d96"). InnerVolumeSpecName "kube-api-access-dff8d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.041134 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dff8d\" (UniqueName: \"kubernetes.io/projected/58cc2d60-9778-460a-bd81-89c8078a4d96-kube-api-access-dff8d\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.041349 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.076351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58cc2d60-9778-460a-bd81-89c8078a4d96" (UID: "58cc2d60-9778-460a-bd81-89c8078a4d96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.142043 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58cc2d60-9778-460a-bd81-89c8078a4d96-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.261386 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b922ea-7281-44f5-b78b-b0ec5d5387d3" path="/var/lib/kubelet/pods/85b922ea-7281-44f5-b78b-b0ec5d5387d3/volumes"
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.262099 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26689f8-7057-45ba-8d53-ae4623ecd2e9" path="/var/lib/kubelet/pods/d26689f8-7057-45ba-8d53-ae4623ecd2e9/volumes"
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.604897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" event={"ID":"7bfdb898-c35d-488c-9478-4aa41570ca9e","Type":"ContainerStarted","Data":"dd0b9bec1f42791b4352a178de34f831f0e9f607b04c323609ff0b973c5cbcfc"}
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.604941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" event={"ID":"7bfdb898-c35d-488c-9478-4aa41570ca9e","Type":"ContainerStarted","Data":"acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7"}
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.605212 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.607400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2m5ld" event={"ID":"58cc2d60-9778-460a-bd81-89c8078a4d96","Type":"ContainerDied","Data":"d58b7f5b37a6a35ef39d3d8b6ebcad0e2da7e5425eb92dcc45ec68fe40722e18"}
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.607437 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2m5ld"
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.607457 4795 scope.go:117] "RemoveContainer" containerID="c35ab86ff89c55002aa81e7c712981fc2a1f900a50a13c618ac66b790346ea8a"
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.609054 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" event={"ID":"5e3c5fd8-2990-4fb9-a8e6-224463172129","Type":"ContainerStarted","Data":"90a7d50899f36dcfd6366fa0b767071e3a607201ea44d9b052839270000be726"}
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.609090 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" event={"ID":"5e3c5fd8-2990-4fb9-a8e6-224463172129","Type":"ContainerStarted","Data":"291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d"}
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.609323 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.613237 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.617024 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.628297 4795 scope.go:117] "RemoveContainer" containerID="ccb42fb76e422903cb8ae67b0af284fad8c19cf7bcd45ce5c5bd37e094afd21b"
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.631232 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" podStartSLOduration=3.631217481 podStartE2EDuration="3.631217481s" podCreationTimestamp="2026-03-20 17:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:19.628366332 +0000 UTC m=+283.086397873" watchObservedRunningTime="2026-03-20 17:22:19.631217481 +0000 UTC m=+283.089249032"
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.644615 4795 scope.go:117] "RemoveContainer" containerID="898e9ddb331a961041951b7bb1edfb2abf5db69d1009da036bfe796e8579e1e3"
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.647591 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"]
Mar 20 17:22:19 crc kubenswrapper[4795]: I0320 17:22:19.655149 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2m5ld"]
Mar 20 17:22:21 crc kubenswrapper[4795]: I0320 17:22:21.173217 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kk5rk"
Mar 20 17:22:21 crc kubenswrapper[4795]: I0320 17:22:21.204814 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" podStartSLOduration=5.204757741 podStartE2EDuration="5.204757741s" podCreationTimestamp="2026-03-20 17:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:19.684861177 +0000 UTC m=+283.142892738" watchObservedRunningTime="2026-03-20 17:22:21.204757741 +0000 UTC m=+284.662789322"
Mar 20 17:22:21 crc kubenswrapper[4795]: I0320 17:22:21.264678 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" path="/var/lib/kubelet/pods/58cc2d60-9778-460a-bd81-89c8078a4d96/volumes"
Mar 20 17:22:23 crc kubenswrapper[4795]: I0320 17:22:23.024623 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mmtf7"]
Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.478000 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"]
Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.478890 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" podUID="5e3c5fd8-2990-4fb9-a8e6-224463172129" containerName="controller-manager" containerID="cri-o://90a7d50899f36dcfd6366fa0b767071e3a607201ea44d9b052839270000be726" gracePeriod=30
Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.573047 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"]
Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.573791 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" podUID="7bfdb898-c35d-488c-9478-4aa41570ca9e" containerName="route-controller-manager" containerID="cri-o://dd0b9bec1f42791b4352a178de34f831f0e9f607b04c323609ff0b973c5cbcfc" gracePeriod=30
Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.738636 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e3c5fd8-2990-4fb9-a8e6-224463172129" containerID="90a7d50899f36dcfd6366fa0b767071e3a607201ea44d9b052839270000be726" exitCode=0
Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.738732 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" event={"ID":"5e3c5fd8-2990-4fb9-a8e6-224463172129","Type":"ContainerDied","Data":"90a7d50899f36dcfd6366fa0b767071e3a607201ea44d9b052839270000be726"}
Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.741943 4795 generic.go:334] "Generic (PLEG): container finished" podID="7bfdb898-c35d-488c-9478-4aa41570ca9e" containerID="dd0b9bec1f42791b4352a178de34f831f0e9f607b04c323609ff0b973c5cbcfc" exitCode=0
Mar 20 17:22:36 crc kubenswrapper[4795]: I0320 17:22:36.741968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" event={"ID":"7bfdb898-c35d-488c-9478-4aa41570ca9e","Type":"ContainerDied","Data":"dd0b9bec1f42791b4352a178de34f831f0e9f607b04c323609ff0b973c5cbcfc"}
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.139845 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.154829 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7844559998-cfclj"
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert\") pod \"5e3c5fd8-2990-4fb9-a8e6-224463172129\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") "
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213113 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config\") pod \"7bfdb898-c35d-488c-9478-4aa41570ca9e\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") "
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213179 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtd2x\" (UniqueName: \"kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x\") pod \"5e3c5fd8-2990-4fb9-a8e6-224463172129\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") "
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213213 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config\") pod \"5e3c5fd8-2990-4fb9-a8e6-224463172129\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") "
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213254 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert\") pod \"7bfdb898-c35d-488c-9478-4aa41570ca9e\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") "
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213293 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh6fm\" (UniqueName: \"kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm\") pod \"7bfdb898-c35d-488c-9478-4aa41570ca9e\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") "
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213328 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca\") pod \"7bfdb898-c35d-488c-9478-4aa41570ca9e\" (UID: \"7bfdb898-c35d-488c-9478-4aa41570ca9e\") "
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213367 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca\") pod \"5e3c5fd8-2990-4fb9-a8e6-224463172129\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") "
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.213418 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles\") pod \"5e3c5fd8-2990-4fb9-a8e6-224463172129\" (UID: \"5e3c5fd8-2990-4fb9-a8e6-224463172129\") "
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.214080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config" (OuterVolumeSpecName: "config") pod "5e3c5fd8-2990-4fb9-a8e6-224463172129" (UID: "5e3c5fd8-2990-4fb9-a8e6-224463172129"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.214080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config" (OuterVolumeSpecName: "config") pod "7bfdb898-c35d-488c-9478-4aa41570ca9e" (UID: "7bfdb898-c35d-488c-9478-4aa41570ca9e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.214459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca" (OuterVolumeSpecName: "client-ca") pod "7bfdb898-c35d-488c-9478-4aa41570ca9e" (UID: "7bfdb898-c35d-488c-9478-4aa41570ca9e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.214834 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5e3c5fd8-2990-4fb9-a8e6-224463172129" (UID: "5e3c5fd8-2990-4fb9-a8e6-224463172129"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca" (OuterVolumeSpecName: "client-ca") pod "5e3c5fd8-2990-4fb9-a8e6-224463172129" (UID: "5e3c5fd8-2990-4fb9-a8e6-224463172129"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215297 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215334 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215359 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215384 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e3c5fd8-2990-4fb9-a8e6-224463172129-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.215410 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bfdb898-c35d-488c-9478-4aa41570ca9e-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.218581 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm" (OuterVolumeSpecName: "kube-api-access-fh6fm") pod "7bfdb898-c35d-488c-9478-4aa41570ca9e" (UID: "7bfdb898-c35d-488c-9478-4aa41570ca9e"). InnerVolumeSpecName "kube-api-access-fh6fm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.218633 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x" (OuterVolumeSpecName: "kube-api-access-vtd2x") pod "5e3c5fd8-2990-4fb9-a8e6-224463172129" (UID: "5e3c5fd8-2990-4fb9-a8e6-224463172129"). InnerVolumeSpecName "kube-api-access-vtd2x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.219265 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5e3c5fd8-2990-4fb9-a8e6-224463172129" (UID: "5e3c5fd8-2990-4fb9-a8e6-224463172129"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.225719 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7bfdb898-c35d-488c-9478-4aa41570ca9e" (UID: "7bfdb898-c35d-488c-9478-4aa41570ca9e"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.316166 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3c5fd8-2990-4fb9-a8e6-224463172129-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.316419 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtd2x\" (UniqueName: \"kubernetes.io/projected/5e3c5fd8-2990-4fb9-a8e6-224463172129-kube-api-access-vtd2x\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.316431 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bfdb898-c35d-488c-9478-4aa41570ca9e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.316440 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh6fm\" (UniqueName: \"kubernetes.io/projected/7bfdb898-c35d-488c-9478-4aa41570ca9e-kube-api-access-fh6fm\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.750535 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" event={"ID":"7bfdb898-c35d-488c-9478-4aa41570ca9e","Type":"ContainerDied","Data":"acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7"} Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.750595 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.750604 4795 scope.go:117] "RemoveContainer" containerID="dd0b9bec1f42791b4352a178de34f831f0e9f607b04c323609ff0b973c5cbcfc" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.753592 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" event={"ID":"5e3c5fd8-2990-4fb9-a8e6-224463172129","Type":"ContainerDied","Data":"291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d"} Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.753723 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7844559998-cfclj" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.777858 4795 scope.go:117] "RemoveContainer" containerID="90a7d50899f36dcfd6366fa0b767071e3a607201ea44d9b052839270000be726" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.803985 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"] Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.811365 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6876bd5d54-rptsb"] Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.818241 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"] Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.821280 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7844559998-cfclj"] Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.983645 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz"] Mar 20 
17:22:37 crc kubenswrapper[4795]: E0320 17:22:37.984119 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="extract-content" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.984206 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="extract-content" Mar 20 17:22:37 crc kubenswrapper[4795]: E0320 17:22:37.984312 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="extract-utilities" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.984403 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="extract-utilities" Mar 20 17:22:37 crc kubenswrapper[4795]: E0320 17:22:37.984484 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bfdb898-c35d-488c-9478-4aa41570ca9e" containerName="route-controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.984553 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bfdb898-c35d-488c-9478-4aa41570ca9e" containerName="route-controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: E0320 17:22:37.984625 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e3c5fd8-2990-4fb9-a8e6-224463172129" containerName="controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.984712 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e3c5fd8-2990-4fb9-a8e6-224463172129" containerName="controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: E0320 17:22:37.984814 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="registry-server" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.984908 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" 
containerName="registry-server" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.985110 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e3c5fd8-2990-4fb9-a8e6-224463172129" containerName="controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.985201 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="58cc2d60-9778-460a-bd81-89c8078a4d96" containerName="registry-server" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.985278 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bfdb898-c35d-488c-9478-4aa41570ca9e" containerName="route-controller-manager" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.985811 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.986326 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz"] Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.990146 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.990663 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.990869 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.991322 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.991510 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.991669 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.992456 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.996093 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.996225 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.996409 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.996608 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.996766 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:22:37 crc kubenswrapper[4795]: I0320 17:22:37.997323 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 
20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.001737 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz"] Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.004840 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz"] Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.031452 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.131498 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-config\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.131835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/783215b2-064c-42d0-a523-6f4f9259526a-serving-cert\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.131986 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvh9x\" (UniqueName: \"kubernetes.io/projected/783215b2-064c-42d0-a523-6f4f9259526a-kube-api-access-zvh9x\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132109 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-client-ca\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132234 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7tmg\" (UniqueName: \"kubernetes.io/projected/e9495f82-c066-4979-9707-1d0b732dc77c-kube-api-access-m7tmg\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132350 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-config\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-client-ca\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132622 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9495f82-c066-4979-9707-1d0b732dc77c-serving-cert\") pod \"route-controller-manager-595896cb64-tcxtz\" 
(UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.132857 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-proxy-ca-bundles\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.233625 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-proxy-ca-bundles\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.233883 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-config\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234040 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/783215b2-064c-42d0-a523-6f4f9259526a-serving-cert\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zvh9x\" (UniqueName: \"kubernetes.io/projected/783215b2-064c-42d0-a523-6f4f9259526a-kube-api-access-zvh9x\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-client-ca\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7tmg\" (UniqueName: \"kubernetes.io/projected/e9495f82-c066-4979-9707-1d0b732dc77c-kube-api-access-m7tmg\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-config\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-client-ca\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 
17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.234555 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9495f82-c066-4979-9707-1d0b732dc77c-serving-cert\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.235196 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-config\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.235216 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-client-ca\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.235241 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-proxy-ca-bundles\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.235379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9495f82-c066-4979-9707-1d0b732dc77c-client-ca\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " 
pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.235878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783215b2-064c-42d0-a523-6f4f9259526a-config\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.240462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/783215b2-064c-42d0-a523-6f4f9259526a-serving-cert\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.254399 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9495f82-c066-4979-9707-1d0b732dc77c-serving-cert\") pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.258904 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvh9x\" (UniqueName: \"kubernetes.io/projected/783215b2-064c-42d0-a523-6f4f9259526a-kube-api-access-zvh9x\") pod \"controller-manager-dcf7c7b76-qwnqz\" (UID: \"783215b2-064c-42d0-a523-6f4f9259526a\") " pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.261493 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7tmg\" (UniqueName: \"kubernetes.io/projected/e9495f82-c066-4979-9707-1d0b732dc77c-kube-api-access-m7tmg\") 
pod \"route-controller-manager-595896cb64-tcxtz\" (UID: \"e9495f82-c066-4979-9707-1d0b732dc77c\") " pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.337807 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.345347 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.734119 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz"] Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.765102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" event={"ID":"783215b2-064c-42d0-a523-6f4f9259526a","Type":"ContainerStarted","Data":"75b2d9dc239e36a4a908c3bd897a6107015f08e5c989407cb2efb25f3c76567b"} Mar 20 17:22:38 crc kubenswrapper[4795]: I0320 17:22:38.796965 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz"] Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.259731 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e3c5fd8-2990-4fb9-a8e6-224463172129" path="/var/lib/kubelet/pods/5e3c5fd8-2990-4fb9-a8e6-224463172129/volumes" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.260920 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bfdb898-c35d-488c-9478-4aa41570ca9e" path="/var/lib/kubelet/pods/7bfdb898-c35d-488c-9478-4aa41570ca9e/volumes" Mar 20 17:22:39 crc kubenswrapper[4795]: E0320 17:22:39.638983 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache]" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.788283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" event={"ID":"783215b2-064c-42d0-a523-6f4f9259526a","Type":"ContainerStarted","Data":"a52553f6e36343f68d4503456fa55802236a40cab19262e24a90ce7ac65e16c2"} Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.788669 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.790539 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" event={"ID":"e9495f82-c066-4979-9707-1d0b732dc77c","Type":"ContainerStarted","Data":"dc7aba5d14ae47e99c998b4f8ebff7603f1c540290bc0c22044158609de9cdd8"} Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.790791 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.790926 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" event={"ID":"e9495f82-c066-4979-9707-1d0b732dc77c","Type":"ContainerStarted","Data":"267d1426526b69c78e2efd2918a303fa490695b0de8d306b9636116b0b7735ec"} Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.793054 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.795780 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.803523 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-dcf7c7b76-qwnqz" podStartSLOduration=3.803510539 podStartE2EDuration="3.803510539s" podCreationTimestamp="2026-03-20 17:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:39.80225323 +0000 UTC m=+303.260284801" watchObservedRunningTime="2026-03-20 17:22:39.803510539 +0000 UTC m=+303.261542080" Mar 20 17:22:39 crc kubenswrapper[4795]: I0320 17:22:39.820635 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-595896cb64-tcxtz" podStartSLOduration=3.820619711 podStartE2EDuration="3.820619711s" podCreationTimestamp="2026-03-20 17:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:39.819240758 +0000 UTC m=+303.277272339" watchObservedRunningTime="2026-03-20 17:22:39.820619711 +0000 UTC m=+303.278651252" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.068100 4795 kubelet.go:2421] 
"SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.069539 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.070417 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.071062 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080" gracePeriod=15 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.071131 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b" gracePeriod=15 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.071144 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94" gracePeriod=15 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.071178 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96" gracePeriod=15 Mar 20 
17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.071157 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018" gracePeriod=15 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.072667 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.072932 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.072952 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.072980 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.072993 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073011 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073023 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073038 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc 
kubenswrapper[4795]: I0320 17:22:42.073050 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073064 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073077 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073095 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073107 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073123 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073134 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073149 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073160 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073176 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073187 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.073201 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073214 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073421 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073444 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073462 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073498 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073519 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073534 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073552 
4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073912 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.073932 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.096410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.098014 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.098660 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.100218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.100260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.100357 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.100402 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.100496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.136189 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.201834 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202174 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.201992 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202229 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202356 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc 
kubenswrapper[4795]: I0320 17:22:42.202386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202400 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202421 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202466 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202485 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.202578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.251847 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.252877 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.253198 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.253377 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.253543 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.253561 4795 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.253769 4795 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="200ms" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.423010 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:22:42 crc kubenswrapper[4795]: W0320 17:22:42.443201 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-c53e75fe9c1a487613bfd138dd166a448ecc9228923a4a4a51385d837d83193f WatchSource:0}: Error finding container c53e75fe9c1a487613bfd138dd166a448ecc9228923a4a4a51385d837d83193f: Status 404 returned error can't find the container with id c53e75fe9c1a487613bfd138dd166a448ecc9228923a4a4a51385d837d83193f Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.446183 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.58:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9c7ca864bab1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC 
m=+305.903637104,LastTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC m=+305.903637104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.454639 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="400ms" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.822168 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"02a472e27a9df1f2394993beadd98fb0933889c8dc6d19ca5759e2e2012a7874"} Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.822578 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c53e75fe9c1a487613bfd138dd166a448ecc9228923a4a4a51385d837d83193f"} Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.823214 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.823508 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: 
connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.825927 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.827788 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.828966 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b" exitCode=0 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.829157 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96" exitCode=0 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.829303 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94" exitCode=0 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.829431 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018" exitCode=2 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.829052 4795 scope.go:117] "RemoveContainer" containerID="abe7988920d6c7dc323a9ef5cf4a1a376896d61a589becb4a6438bcc2373e7d1" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.832181 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"49054187-cb30-4f07-b67a-794c2503f50a","Type":"ContainerDied","Data":"30207c849aa57355d0d1027a4dded499a63ab8fad8a6d8162ced0752bd75a382"} Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.832195 4795 generic.go:334] "Generic (PLEG): container finished" podID="49054187-cb30-4f07-b67a-794c2503f50a" containerID="30207c849aa57355d0d1027a4dded499a63ab8fad8a6d8162ced0752bd75a382" exitCode=0 Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.833225 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.834024 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.834575 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:42 crc kubenswrapper[4795]: E0320 17:22:42.856052 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="800ms" Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.972444 4795 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 20 17:22:42 crc kubenswrapper[4795]: I0320 17:22:42.972538 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 20 17:22:43 crc kubenswrapper[4795]: E0320 17:22:43.066007 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.58:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9c7ca864bab1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC m=+305.903637104,LastTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC m=+305.903637104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:22:43 crc kubenswrapper[4795]: E0320 17:22:43.657636 4795 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="1.6s" Mar 20 17:22:43 crc kubenswrapper[4795]: I0320 17:22:43.844245 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.338532 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.339300 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.339785 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437562 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock\") pod \"49054187-cb30-4f07-b67a-794c2503f50a\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437629 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir\") pod \"49054187-cb30-4f07-b67a-794c2503f50a\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437651 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access\") pod \"49054187-cb30-4f07-b67a-794c2503f50a\" (UID: \"49054187-cb30-4f07-b67a-794c2503f50a\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock" (OuterVolumeSpecName: "var-lock") pod "49054187-cb30-4f07-b67a-794c2503f50a" (UID: "49054187-cb30-4f07-b67a-794c2503f50a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437729 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "49054187-cb30-4f07-b67a-794c2503f50a" (UID: "49054187-cb30-4f07-b67a-794c2503f50a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437919 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.437934 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49054187-cb30-4f07-b67a-794c2503f50a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.442053 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.443325 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.443885 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.444292 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.444644 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.445197 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "49054187-cb30-4f07-b67a-794c2503f50a" (UID: "49054187-cb30-4f07-b67a-794c2503f50a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.538746 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.538843 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.538885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.538951 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.538963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.539064 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.539404 4795 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.539431 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49054187-cb30-4f07-b67a-794c2503f50a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.539451 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.539469 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.856210 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.857749 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080" exitCode=0 Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.857814 4795 scope.go:117] "RemoveContainer" containerID="554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.857917 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.861466 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"49054187-cb30-4f07-b67a-794c2503f50a","Type":"ContainerDied","Data":"bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa"} Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.861522 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd0b717910d1498d282ddce1e4c125ca4397aad032d825a7ab35bb78e6a401fa" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.861535 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.874934 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.875572 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.876290 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 
crc kubenswrapper[4795]: I0320 17:22:44.887906 4795 scope.go:117] "RemoveContainer" containerID="730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.890592 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.890965 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.891314 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.913205 4795 scope.go:117] "RemoveContainer" containerID="876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.932849 4795 scope.go:117] "RemoveContainer" containerID="7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.956499 4795 scope.go:117] "RemoveContainer" containerID="f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080" Mar 20 17:22:44 crc kubenswrapper[4795]: I0320 17:22:44.980521 4795 scope.go:117] "RemoveContainer" 
containerID="357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.005992 4795 scope.go:117] "RemoveContainer" containerID="554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.006350 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\": container with ID starting with 554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b not found: ID does not exist" containerID="554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.006382 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b"} err="failed to get container status \"554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\": rpc error: code = NotFound desc = could not find container \"554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b\": container with ID starting with 554cfe73ea805cccf143fcf7ccc536b81b1fcd18961c5f25a1814b178d974a6b not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.006402 4795 scope.go:117] "RemoveContainer" containerID="730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.006621 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\": container with ID starting with 730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96 not found: ID does not exist" containerID="730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96" Mar 20 17:22:45 crc 
kubenswrapper[4795]: I0320 17:22:45.006644 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96"} err="failed to get container status \"730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\": rpc error: code = NotFound desc = could not find container \"730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96\": container with ID starting with 730ced4cd9eb2d583ee4aa405308022fa74e0c4e43b7bff28d2250ad336f6f96 not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.006660 4795 scope.go:117] "RemoveContainer" containerID="876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.006917 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\": container with ID starting with 876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94 not found: ID does not exist" containerID="876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.006965 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94"} err="failed to get container status \"876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\": rpc error: code = NotFound desc = could not find container \"876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94\": container with ID starting with 876f2e3cc50c05ac9764455b31711ed460685067cd84f118962c8fecb94f4e94 not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.006979 4795 scope.go:117] "RemoveContainer" containerID="7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018" Mar 20 
17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.008241 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\": container with ID starting with 7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018 not found: ID does not exist" containerID="7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.008263 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018"} err="failed to get container status \"7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\": rpc error: code = NotFound desc = could not find container \"7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018\": container with ID starting with 7b55a335cedbe5289fbe05308e890f47e0828376a3b55f7a8d188bc45a471018 not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.008278 4795 scope.go:117] "RemoveContainer" containerID="f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.008611 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\": container with ID starting with f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080 not found: ID does not exist" containerID="f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.008636 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080"} err="failed to get container status 
\"f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\": rpc error: code = NotFound desc = could not find container \"f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080\": container with ID starting with f7a2e711955ecf767f22132e3f0b7c551b59e1c1f8548459dcc2f033b4824080 not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.008649 4795 scope.go:117] "RemoveContainer" containerID="357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.008919 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\": container with ID starting with 357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782 not found: ID does not exist" containerID="357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.008948 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782"} err="failed to get container status \"357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\": rpc error: code = NotFound desc = could not find container \"357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782\": container with ID starting with 357eaf2c9046b49332ac5347136c2b90096ea2adbd7ebd40288c462fcfd5c782 not found: ID does not exist" Mar 20 17:22:45 crc kubenswrapper[4795]: E0320 17:22:45.258183 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="3.2s" Mar 20 17:22:45 crc kubenswrapper[4795]: I0320 17:22:45.262289 4795 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 17:22:47 crc kubenswrapper[4795]: I0320 17:22:47.257608 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:47 crc kubenswrapper[4795]: I0320 17:22:47.258261 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.060250 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerName="oauth-openshift" containerID="cri-o://5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f" gracePeriod=15 Mar 20 17:22:48 crc kubenswrapper[4795]: E0320 17:22:48.459242 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.58:6443: connect: connection refused" interval="6.4s" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.601940 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.602665 4795 status_manager.go:851] "Failed to get status for pod" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-mmtf7\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.603202 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.604003 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708259 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708334 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") 
" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708512 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708578 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708495 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.709585 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.708678 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.709937 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.709991 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710059 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710108 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: 
\"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdtvs\" (UniqueName: \"kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710192 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710249 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710291 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca\") pod \"74d8b767-93df-4c96-a7f0-e7e84ba99380\" (UID: \"74d8b767-93df-4c96-a7f0-e7e84ba99380\") " Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.710311 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). 
InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.711392 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.711841 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.711926 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.711952 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/74d8b767-93df-4c96-a7f0-e7e84ba99380-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.711973 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.713530 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca" 
(OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.718459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.719217 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.719424 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs" (OuterVolumeSpecName: "kube-api-access-fdtvs") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "kube-api-access-fdtvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.719944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.720218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.723097 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.724572 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.725093 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.727159 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "74d8b767-93df-4c96-a7f0-e7e84ba99380" (UID: "74d8b767-93df-4c96-a7f0-e7e84ba99380"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812469 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812504 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812514 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 
17:22:48.812523 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812532 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812541 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812552 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812563 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdtvs\" (UniqueName: \"kubernetes.io/projected/74d8b767-93df-4c96-a7f0-e7e84ba99380-kube-api-access-fdtvs\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812571 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.812580 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/74d8b767-93df-4c96-a7f0-e7e84ba99380-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.890468 4795 generic.go:334] "Generic (PLEG): container finished" podID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerID="5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f" exitCode=0 Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.890540 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.890534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" event={"ID":"74d8b767-93df-4c96-a7f0-e7e84ba99380","Type":"ContainerDied","Data":"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f"} Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.891105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" event={"ID":"74d8b767-93df-4c96-a7f0-e7e84ba99380","Type":"ContainerDied","Data":"54c16e287e6b044067d81a5f122f5fce8bd8b850064a731beff318d152b5a0e9"} Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.891130 4795 scope.go:117] "RemoveContainer" containerID="5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.892545 4795 status_manager.go:851] "Failed to get status for pod" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-mmtf7\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.893012 4795 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.893330 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.913808 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.914190 4795 status_manager.go:851] "Failed to get status for pod" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-mmtf7\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.914548 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.925084 4795 scope.go:117] "RemoveContainer" 
containerID="5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f" Mar 20 17:22:48 crc kubenswrapper[4795]: E0320 17:22:48.925550 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f\": container with ID starting with 5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f not found: ID does not exist" containerID="5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f" Mar 20 17:22:48 crc kubenswrapper[4795]: I0320 17:22:48.925591 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f"} err="failed to get container status \"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f\": rpc error: code = NotFound desc = could not find container \"5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f\": container with ID starting with 5c258b3501a7e701785e5935287d7ab00c7696e1e8456a37d6ea0c44fe12333f not found: ID does not exist" Mar 20 17:22:49 crc kubenswrapper[4795]: E0320 17:22:49.783302 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache]" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.251500 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.253809 4795 status_manager.go:851] "Failed to get status for pod" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-mmtf7\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.254531 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.255166 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.270924 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.270979 4795 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:52 crc kubenswrapper[4795]: E0320 17:22:52.271564 4795 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.272179 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:52 crc kubenswrapper[4795]: W0320 17:22:52.305905 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-f5c5cb4337bdece5572d0d0363d6a4a14622a6b48d7933b18c528a4a8c2f2a6d WatchSource:0}: Error finding container f5c5cb4337bdece5572d0d0363d6a4a14622a6b48d7933b18c528a4a8c2f2a6d: Status 404 returned error can't find the container with id f5c5cb4337bdece5572d0d0363d6a4a14622a6b48d7933b18c528a4a8c2f2a6d Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.936753 4795 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="12ddd5d234fe00d0e7d99a02a214fba89126a9e4151814b42f99988eaca28de9" exitCode=0 Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.936856 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"12ddd5d234fe00d0e7d99a02a214fba89126a9e4151814b42f99988eaca28de9"} Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.937182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f5c5cb4337bdece5572d0d0363d6a4a14622a6b48d7933b18c528a4a8c2f2a6d"} Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.937569 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.937601 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:52 crc kubenswrapper[4795]: E0320 17:22:52.938152 4795 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.938175 4795 status_manager.go:851] "Failed to get status for pod" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" pod="openshift-authentication/oauth-openshift-558db77b4-mmtf7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-mmtf7\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.938615 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:52 crc kubenswrapper[4795]: I0320 17:22:52.939061 4795 status_manager.go:851] "Failed to get status for pod" podUID="49054187-cb30-4f07-b67a-794c2503f50a" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.58:6443: connect: connection refused" Mar 20 17:22:53 crc kubenswrapper[4795]: E0320 17:22:53.067136 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.58:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9c7ca864bab1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC m=+305.903637104,LastTimestamp:2026-03-20 17:22:42.445605553 +0000 UTC m=+305.903637104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:22:53 crc kubenswrapper[4795]: I0320 17:22:53.945387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d9a2915b230565006626461f0115c0347a53283c27adb636bedd05c3e333e802"} Mar 20 17:22:53 crc kubenswrapper[4795]: I0320 17:22:53.945736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f9affc3d86a0fa9b02e5575739abc74afe9da1997cd31dd1261239364f081bf5"} Mar 20 17:22:53 crc kubenswrapper[4795]: I0320 17:22:53.945746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"72f892532c7c5996be453821f507a0511c3cff77c37b060440661f4b51418fa2"} Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.955846 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.956751 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.956830 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950" exitCode=1 Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.956939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950"} Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.957604 4795 scope.go:117] "RemoveContainer" containerID="dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950" Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.964257 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d379816cc9a978f0a780f650e77d1ba0087c3e10f99d70a11a2f9f9c3aeea2c3"} Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.964306 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bbf2d4f28a6b1053fd498390c982e3dbef480407ed5111fadfd90a0f54641c1b"} Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.964555 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.964584 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:54 crc kubenswrapper[4795]: I0320 17:22:54.964608 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:22:55 crc kubenswrapper[4795]: I0320 17:22:55.745076 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:22:55 crc kubenswrapper[4795]: I0320 17:22:55.975059 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:22:55 crc kubenswrapper[4795]: I0320 17:22:55.976207 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 17:22:55 crc kubenswrapper[4795]: I0320 17:22:55.976294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d4d13b4f6f960995829a9e21a67c7d3e0b19cec5cac56f814725866cead5f520"} Mar 20 17:22:57 crc kubenswrapper[4795]: I0320 17:22:57.272456 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:57 crc kubenswrapper[4795]: I0320 17:22:57.272521 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:57 crc kubenswrapper[4795]: I0320 17:22:57.281327 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:22:58 crc kubenswrapper[4795]: I0320 17:22:58.385532 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:22:58 crc kubenswrapper[4795]: I0320 17:22:58.386000 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 17:22:58 crc kubenswrapper[4795]: I0320 17:22:58.386487 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 17:22:59 crc kubenswrapper[4795]: E0320 17:22:59.923637 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache]" Mar 20 17:22:59 crc kubenswrapper[4795]: I0320 17:22:59.974074 4795 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:23:00 crc kubenswrapper[4795]: I0320 17:23:00.009600 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:23:00 crc kubenswrapper[4795]: I0320 17:23:00.009658 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:23:00 crc kubenswrapper[4795]: I0320 17:23:00.018788 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:23:00 crc kubenswrapper[4795]: I0320 17:23:00.117189 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="3f481f95-4236-49c0-a819-66b5416cb925" Mar 20 17:23:01 crc kubenswrapper[4795]: I0320 17:23:01.015768 4795 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:23:01 crc kubenswrapper[4795]: I0320 17:23:01.015826 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="22f4248a-e081-4e6f-9dde-6641fb13d92f" Mar 20 17:23:01 crc kubenswrapper[4795]: I0320 17:23:01.018902 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="3f481f95-4236-49c0-a819-66b5416cb925" Mar 20 17:23:02 crc kubenswrapper[4795]: I0320 17:23:02.956485 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.226517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.226973 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.227130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.230168 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.230203 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.230940 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.239132 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.239318 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.253356 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.253944 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") 
pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.330833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.337029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.375980 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.391316 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:23:04 crc kubenswrapper[4795]: I0320 17:23:04.404170 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:23:04 crc kubenswrapper[4795]: W0320 17:23:04.932058 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-f14ed71a174f01c8546b89745e1a1f61c02f8af9d66f6e9ada2ca45f28cda98f WatchSource:0}: Error finding container f14ed71a174f01c8546b89745e1a1f61c02f8af9d66f6e9ada2ca45f28cda98f: Status 404 returned error can't find the container with id f14ed71a174f01c8546b89745e1a1f61c02f8af9d66f6e9ada2ca45f28cda98f Mar 20 17:23:05 crc kubenswrapper[4795]: W0320 17:23:05.007451 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-67f03fa1ae0e48ed154c1dbdb6137d350a3b5a11d1953bc27ca239ec9c3611d9 WatchSource:0}: Error finding container 67f03fa1ae0e48ed154c1dbdb6137d350a3b5a11d1953bc27ca239ec9c3611d9: Status 404 returned error can't find the container with id 67f03fa1ae0e48ed154c1dbdb6137d350a3b5a11d1953bc27ca239ec9c3611d9 Mar 20 17:23:05 crc kubenswrapper[4795]: I0320 17:23:05.050551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"67f03fa1ae0e48ed154c1dbdb6137d350a3b5a11d1953bc27ca239ec9c3611d9"} Mar 20 17:23:05 crc kubenswrapper[4795]: I0320 17:23:05.052450 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f14ed71a174f01c8546b89745e1a1f61c02f8af9d66f6e9ada2ca45f28cda98f"} Mar 20 17:23:05 crc kubenswrapper[4795]: W0320 17:23:05.071208 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d3e37295410ee8bbc13aa5e3326b77ccf092bd502d694e43da521303630a866f WatchSource:0}: Error finding container d3e37295410ee8bbc13aa5e3326b77ccf092bd502d694e43da521303630a866f: Status 404 returned error can't find the container with id d3e37295410ee8bbc13aa5e3326b77ccf092bd502d694e43da521303630a866f Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.047894 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.064124 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6771992b61542ed1ebd988c425bef3156fac5bdb7fe3d5365ed4ee0e5a60b3c7"} Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.064194 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d3e37295410ee8bbc13aa5e3326b77ccf092bd502d694e43da521303630a866f"} Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.064404 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.066970 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"231e8936baa065c0b973a2f2518b4056e1234d1e287dc1cd623692b19a4a16ad"} Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.070736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ba02a145c138c54849e78cb97f82c698845800b8de0805be5c6d93c948a71de9"} Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.224269 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.666878 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.667129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.733285 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 17:23:06 crc kubenswrapper[4795]: I0320 17:23:06.975904 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.079803 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.079885 4795 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="231e8936baa065c0b973a2f2518b4056e1234d1e287dc1cd623692b19a4a16ad" exitCode=255 Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.079990 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"231e8936baa065c0b973a2f2518b4056e1234d1e287dc1cd623692b19a4a16ad"} Mar 
20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.080590 4795 scope.go:117] "RemoveContainer" containerID="231e8936baa065c0b973a2f2518b4056e1234d1e287dc1cd623692b19a4a16ad" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.597378 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.647291 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.687749 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.924715 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 17:23:07 crc kubenswrapper[4795]: I0320 17:23:07.942033 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.054065 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.087123 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.087182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707"} Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.188259 4795 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.217388 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.254423 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.281349 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.284643 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.299708 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.378780 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.386862 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.386960 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: 
connection refused" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.544195 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.641562 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 17:23:08 crc kubenswrapper[4795]: I0320 17:23:08.778452 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.007834 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.089895 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.095283 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.095998 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.096074 4795 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707" exitCode=255 Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.096133 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707"} Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.096203 4795 scope.go:117] "RemoveContainer" containerID="231e8936baa065c0b973a2f2518b4056e1234d1e287dc1cd623692b19a4a16ad" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.096744 4795 scope.go:117] "RemoveContainer" containerID="b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707" Mar 20 17:23:09 crc kubenswrapper[4795]: E0320 17:23:09.097165 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.206494 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.240156 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.441805 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.482670 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.634974 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 
17:23:09.706975 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 17:23:09 crc kubenswrapper[4795]: I0320 17:23:09.863540 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 17:23:10 crc kubenswrapper[4795]: E0320 17:23:10.073777 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache]" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.074576 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.105266 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.106247 4795 scope.go:117] "RemoveContainer" containerID="b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707" Mar 20 17:23:10 crc kubenswrapper[4795]: E0320 
17:23:10.106590 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.157885 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.161943 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=28.161917482 podStartE2EDuration="28.161917482s" podCreationTimestamp="2026-03-20 17:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:23:00.037671249 +0000 UTC m=+323.495702790" watchObservedRunningTime="2026-03-20 17:23:10.161917482 +0000 UTC m=+333.619949073" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.166042 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mmtf7","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.166196 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.173261 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.173905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 17:23:10 crc 
kubenswrapper[4795]: I0320 17:23:10.194850 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=11.19482764 podStartE2EDuration="11.19482764s" podCreationTimestamp="2026-03-20 17:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:23:10.189490619 +0000 UTC m=+333.647522230" watchObservedRunningTime="2026-03-20 17:23:10.19482764 +0000 UTC m=+333.652859211" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.269765 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.516586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.526513 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.638568 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.656120 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 17:23:10 crc kubenswrapper[4795]: I0320 17:23:10.662514 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.006212 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.006561 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://02a472e27a9df1f2394993beadd98fb0933889c8dc6d19ca5759e2e2012a7874" gracePeriod=5 Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.265786 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" path="/var/lib/kubelet/pods/74d8b767-93df-4c96-a7f0-e7e84ba99380/volumes" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.294213 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.642562 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.661865 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.681523 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.765248 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 17:23:11 crc kubenswrapper[4795]: I0320 17:23:11.841277 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 17:23:12 crc kubenswrapper[4795]: I0320 17:23:12.322644 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 17:23:12 crc kubenswrapper[4795]: I0320 17:23:12.645905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 17:23:12 crc 
kubenswrapper[4795]: I0320 17:23:12.773911 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 17:23:13 crc kubenswrapper[4795]: I0320 17:23:13.532045 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 17:23:13 crc kubenswrapper[4795]: I0320 17:23:13.656751 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.100077 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.170537 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.289224 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.392624 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.437618 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.669869 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 17:23:14 crc kubenswrapper[4795]: I0320 17:23:14.905399 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.128631 4795 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.209482 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.282186 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.308881 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.361576 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.455597 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.470555 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.508032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.599839 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.656422 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.712151 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 17:23:15 crc kubenswrapper[4795]: I0320 17:23:15.782912 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.076640 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.129757 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.150835 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.150931 4795 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="02a472e27a9df1f2394993beadd98fb0933889c8dc6d19ca5759e2e2012a7874" exitCode=137 Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.209362 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.257156 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.597223 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.628168 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.628675 4795 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.698894 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801599 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801676 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801794 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:23:16 crc 
kubenswrapper[4795]: I0320 17:23:16.801941 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.801963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802098 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802401 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802432 4795 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802452 4795 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.802472 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.817452 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.903292 4795 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:23:16 crc kubenswrapper[4795]: I0320 17:23:16.965989 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.161344 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.161475 4795 scope.go:117] "RemoveContainer" containerID="02a472e27a9df1f2394993beadd98fb0933889c8dc6d19ca5759e2e2012a7874" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.161773 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.188479 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.258634 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.264523 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.265019 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.285302 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.285353 4795 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a3302709-36e6-471a-b69d-33af908a64cd" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.293544 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.294226 4795 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a3302709-36e6-471a-b69d-33af908a64cd" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.294492 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.315675 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.333429 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.402313 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.414382 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.437254 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.470929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.699810 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.735314 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.747723 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.799440 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.849174 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 17:23:17 crc kubenswrapper[4795]: I0320 17:23:17.986353 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.017466 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.057288 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.068939 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.077766 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.084974 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.094294 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.387124 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.387218 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.387294 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.388272 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"d4d13b4f6f960995829a9e21a67c7d3e0b19cec5cac56f814725866cead5f520"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.388470 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://d4d13b4f6f960995829a9e21a67c7d3e0b19cec5cac56f814725866cead5f520" gracePeriod=30 Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.413366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.421562 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.494141 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.515587 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 
20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.583613 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.642335 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.667345 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.735363 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.757929 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.800362 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.861822 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.902530 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 17:23:18 crc kubenswrapper[4795]: I0320 17:23:18.953524 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.012851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.027468 4795 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.139756 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.547818 4795 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.649173 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.649543 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.868432 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.919434 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 17:23:19 crc kubenswrapper[4795]: I0320 17:23:19.981973 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.102752 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.114920 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 17:23:20 crc kubenswrapper[4795]: E0320 17:23:20.244690 4795 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache]" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.288139 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.365256 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.458523 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.525357 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.531492 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.540772 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.666562 4795 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.690411 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.714116 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.715297 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:23:20 crc kubenswrapper[4795]: I0320 17:23:20.821918 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.015380 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.082434 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.177412 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.195660 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.259007 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.297296 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 
17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.297645 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.586072 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.751562 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.769946 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.779585 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:23:21 crc kubenswrapper[4795]: I0320 17:23:21.827174 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.047218 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.124018 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.214614 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.254049 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.303056 4795 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.310205 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.415876 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.656851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.657742 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.669294 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.680329 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.712809 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.793491 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.807804 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 17:23:22 crc kubenswrapper[4795]: I0320 17:23:22.965314 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.061738 4795 reflector.go:368] Caches populated for 
*v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.066571 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.097561 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.239274 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.252469 4795 scope.go:117] "RemoveContainer" containerID="b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.332501 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.338773 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.344114 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.365722 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.449154 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.474252 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.504275 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.564349 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.579854 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.581119 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.728265 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.760045 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.827648 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 17:23:23 crc kubenswrapper[4795]: I0320 17:23:23.922597 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.023370 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.090954 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.217587 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 
20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.217672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201"} Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.280934 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.350028 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.440346 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.540724 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.585002 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.619578 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.672263 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.724513 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.760711 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" 
Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.857630 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.913573 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.951829 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 17:23:24 crc kubenswrapper[4795]: I0320 17:23:24.959871 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.081863 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.228005 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.228720 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.228799 4795 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201" exitCode=255 Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.228861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201"} Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.228926 4795 scope.go:117] "RemoveContainer" containerID="b5a23266736a7e764bb1f2d5a5550ecc22eb650d345e2c39f4c6ed2395e46707" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.230458 4795 scope.go:117] "RemoveContainer" containerID="25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201" Mar 20 17:23:25 crc kubenswrapper[4795]: E0320 17:23:25.230834 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.270195 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.320739 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.342448 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.437161 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.466670 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:23:25 crc 
kubenswrapper[4795]: I0320 17:23:25.595359 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.633424 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.671932 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.722494 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.743421 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.836990 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.908282 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:23:25 crc kubenswrapper[4795]: I0320 17:23:25.931313 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.006040 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.235826 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.238277 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.253295 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.397518 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.525625 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.725359 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 17:23:26 crc kubenswrapper[4795]: I0320 17:23:26.763200 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.129612 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.290726 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"] Mar 20 17:23:27 crc kubenswrapper[4795]: E0320 17:23:27.291114 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291143 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 17:23:27 crc kubenswrapper[4795]: E0320 17:23:27.291181 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49054187-cb30-4f07-b67a-794c2503f50a" 
containerName="installer"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291197 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="49054187-cb30-4f07-b67a-794c2503f50a" containerName="installer"
Mar 20 17:23:27 crc kubenswrapper[4795]: E0320 17:23:27.291230 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerName="oauth-openshift"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291247 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerName="oauth-openshift"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291458 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291493 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d8b767-93df-4c96-a7f0-e7e84ba99380" containerName="oauth-openshift"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.291521 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="49054187-cb30-4f07-b67a-794c2503f50a" containerName="installer"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.292296 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.296228 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.296264 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.296517 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.297126 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.297196 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.297503 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.298037 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.298762 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.300848 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.301075 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.303564 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.303817 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.309375 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.317435 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"]
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.320243 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.322273 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.325548 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.329157 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.364428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-router-certs\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.364594 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.364655 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-login\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.364727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-error\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.364776 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.364819 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365030 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-audit-policies\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365111 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365175 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6dc2\" (UniqueName: \"kubernetes.io/projected/85ae2267-12da-42ef-8382-75d6aa39b954-kube-api-access-t6dc2\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365324 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85ae2267-12da-42ef-8382-75d6aa39b954-audit-dir\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365593 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-service-ca\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.365663 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-session\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.377851 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466286 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466384 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-audit-policies\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466461 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6dc2\" (UniqueName: \"kubernetes.io/projected/85ae2267-12da-42ef-8382-75d6aa39b954-kube-api-access-t6dc2\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466556 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85ae2267-12da-42ef-8382-75d6aa39b954-audit-dir\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466629 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-service-ca\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466660 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-session\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-router-certs\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-login\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-error\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.466808 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.467645 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85ae2267-12da-42ef-8382-75d6aa39b954-audit-dir\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.468056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.468430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-audit-policies\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.469021 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.469560 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-service-ca\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.474673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.474761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-error\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.476141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.476855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-template-login\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.476885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-session\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.477027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-router-certs\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.477376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.478217 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85ae2267-12da-42ef-8382-75d6aa39b954-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.495969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6dc2\" (UniqueName: \"kubernetes.io/projected/85ae2267-12da-42ef-8382-75d6aa39b954-kube-api-access-t6dc2\") pod \"oauth-openshift-75d6949b4b-gmjkq\" (UID: \"85ae2267-12da-42ef-8382-75d6aa39b954\") " pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.549544 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.625919 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:27 crc kubenswrapper[4795]: I0320 17:23:27.930152 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 17:23:28 crc kubenswrapper[4795]: I0320 17:23:28.106341 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 17:23:28 crc kubenswrapper[4795]: I0320 17:23:28.115512 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"]
Mar 20 17:23:28 crc kubenswrapper[4795]: I0320 17:23:28.254376 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" event={"ID":"85ae2267-12da-42ef-8382-75d6aa39b954","Type":"ContainerStarted","Data":"c56de0fe70fac295a9491c34522e6f93fe547c373e6e173289b0df1ab2d61004"}
Mar 20 17:23:28 crc kubenswrapper[4795]: I0320 17:23:28.324735 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 17:23:28 crc kubenswrapper[4795]: I0320 17:23:28.434075 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 20 17:23:28 crc kubenswrapper[4795]: I0320 17:23:28.590360 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.006682 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.059455 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.263216 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.263264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" event={"ID":"85ae2267-12da-42ef-8382-75d6aa39b954","Type":"ContainerStarted","Data":"b07f42b1bf447b1e61806e312444c517c0c6f0dcde6496eb41887f0ceaaae57e"}
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.268183 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq"
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.323803 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-75d6949b4b-gmjkq" podStartSLOduration=66.323781343 podStartE2EDuration="1m6.323781343s" podCreationTimestamp="2026-03-20 17:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:23:29.292059063 +0000 UTC m=+352.750090604" watchObservedRunningTime="2026-03-20 17:23:29.323781343 +0000 UTC m=+352.781812884"
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.342200 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.465149 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.683139 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.745410 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.849994 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 20 17:23:29 crc kubenswrapper[4795]: I0320 17:23:29.896517 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 17:23:30 crc kubenswrapper[4795]: E0320 17:23:30.399809 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice/crio-acee0df29b84ba13799cafbb54ab6b095cbd1b06a3b5d311504befc6e3a95aa7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfdb898_c35d_488c_9478_4aa41570ca9e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice/crio-291fe9b1b8a52dc80d3dc342dd6b5c362cef6483c138538d0955fde02463959d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c5fd8_2990_4fb9_a8e6_224463172129.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 17:23:30 crc kubenswrapper[4795]: I0320 17:23:30.998426 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 17:23:37 crc kubenswrapper[4795]: E0320 17:23:37.286192 4795 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/a28636b1ecd20bb4083afbd5ef8fb21bddc7f459d75a0207e71de4fce8d42ee9/diff" to get inode usage: stat /var/lib/containers/storage/overlay/a28636b1ecd20bb4083afbd5ef8fb21bddc7f459d75a0207e71de4fce8d42ee9/diff: no such file or directory, extraDiskErr:
Mar 20 17:23:37 crc kubenswrapper[4795]: E0320 17:23:37.286355 4795 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/2873f9ef1dc8861ab4921f68e7ee5d71811d6d581545e885e23b8f85cb0879d5/diff" to get inode usage: stat /var/lib/containers/storage/overlay/2873f9ef1dc8861ab4921f68e7ee5d71811d6d581545e885e23b8f85cb0879d5/diff: no such file or directory, extraDiskErr:
Mar 20 17:23:39 crc kubenswrapper[4795]: I0320 17:23:39.253018 4795 scope.go:117] "RemoveContainer" containerID="25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201"
Mar 20 17:23:39 crc kubenswrapper[4795]: E0320 17:23:39.253645 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 17:23:44 crc kubenswrapper[4795]: I0320 17:23:44.410733 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.396561 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.399862 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.400794 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.400880 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d4d13b4f6f960995829a9e21a67c7d3e0b19cec5cac56f814725866cead5f520" exitCode=137
Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.400922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d4d13b4f6f960995829a9e21a67c7d3e0b19cec5cac56f814725866cead5f520"}
Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.400962 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f959cd95b7529d39a7d10c18164d045af09efa73e7189e024e528bd75f24eb45"}
Mar 20 17:23:49 crc kubenswrapper[4795]: I0320 17:23:49.400990 4795 scope.go:117] "RemoveContainer" containerID="dc7519be4f122bc7d20b56ba54ba210f4722a8b346c64093a81f4d4eeb25c950"
Mar 20 17:23:50 crc kubenswrapper[4795]: I0320 17:23:50.409012 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerID="c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe" exitCode=0
Mar 20 17:23:50 crc kubenswrapper[4795]: I0320 17:23:50.409073 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerDied","Data":"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe"}
Mar 20 17:23:50 crc kubenswrapper[4795]: I0320 17:23:50.410372 4795 scope.go:117] "RemoveContainer" containerID="c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe"
Mar 20 17:23:50 crc kubenswrapper[4795]: I0320 17:23:50.416865 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 20 17:23:50 crc kubenswrapper[4795]: I0320 17:23:50.420783 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 17:23:51 crc kubenswrapper[4795]: I0320 17:23:51.429417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerStarted","Data":"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f"}
Mar 20 17:23:51 crc kubenswrapper[4795]: I0320 17:23:51.430739 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs"
Mar 20 17:23:51 crc kubenswrapper[4795]: I0320 17:23:51.434515 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs"
Mar 20 17:23:52 crc kubenswrapper[4795]: I0320 17:23:52.252007 4795 scope.go:117] "RemoveContainer" containerID="25e90138d095fe66d3ab62c39626633fbe89b02432e1c98cedb697e6d48a6201"
Mar 20 17:23:52 crc kubenswrapper[4795]: I0320 17:23:52.451536 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log"
Mar 20 17:23:52 crc kubenswrapper[4795]: I0320 17:23:52.452254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b2d95a75f92321d727c05b3f6ba6a7835679bffdbaf8404b31b93e6e5deb82aa"}
Mar 20 17:23:52 crc kubenswrapper[4795]: I0320 17:23:52.955578 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:23:58 crc kubenswrapper[4795]: I0320 17:23:58.386600 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:23:58 crc kubenswrapper[4795]: I0320 17:23:58.394782 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:24:02 crc kubenswrapper[4795]: I0320 17:24:02.966771 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.047978 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567124-wjlwc"]
Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.049122 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-wjlwc"
Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.050592 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.050905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.051078 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.068607 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-wjlwc"]
Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.182203 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x59l4\" (UniqueName: \"kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4\") pod \"auto-csr-approver-29567124-wjlwc\" (UID: \"35c14395-0a4c-47be-8f64-382e60e3faad\") " pod="openshift-infra/auto-csr-approver-29567124-wjlwc"
Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.283734 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x59l4\" (UniqueName: \"kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4\") pod \"auto-csr-approver-29567124-wjlwc\" (UID: \"35c14395-0a4c-47be-8f64-382e60e3faad\") " pod="openshift-infra/auto-csr-approver-29567124-wjlwc"
Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.303878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x59l4\" (UniqueName: \"kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4\") pod \"auto-csr-approver-29567124-wjlwc\" (UID: \"35c14395-0a4c-47be-8f64-382e60e3faad\") " pod="openshift-infra/auto-csr-approver-29567124-wjlwc"
Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.363379 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-wjlwc"
Mar 20 17:24:09 crc kubenswrapper[4795]: I0320 17:24:09.829246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-wjlwc"]
Mar 20 17:24:10 crc kubenswrapper[4795]: I0320 17:24:10.592861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" event={"ID":"35c14395-0a4c-47be-8f64-382e60e3faad","Type":"ContainerStarted","Data":"152feee1511bbc3ac328d2740d0920b24a83751d1d4c0a4e97f0c8bed6a7d401"}
Mar 20 17:24:11 crc kubenswrapper[4795]: I0320 17:24:11.600432 4795 generic.go:334] "Generic (PLEG): container finished" podID="35c14395-0a4c-47be-8f64-382e60e3faad" containerID="e208a8a62ce5332bce059cfe9498a63b10989e2ede473bf8237789de0f3da7f0" exitCode=0
Mar 20 17:24:11 crc kubenswrapper[4795]: I0320 17:24:11.600488 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" event={"ID":"35c14395-0a4c-47be-8f64-382e60e3faad","Type":"ContainerDied","Data":"e208a8a62ce5332bce059cfe9498a63b10989e2ede473bf8237789de0f3da7f0"}
Mar 20 17:24:12 crc kubenswrapper[4795]: I0320 17:24:12.958630 4795 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.032148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x59l4\" (UniqueName: \"kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4\") pod \"35c14395-0a4c-47be-8f64-382e60e3faad\" (UID: \"35c14395-0a4c-47be-8f64-382e60e3faad\") " Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.039803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4" (OuterVolumeSpecName: "kube-api-access-x59l4") pod "35c14395-0a4c-47be-8f64-382e60e3faad" (UID: "35c14395-0a4c-47be-8f64-382e60e3faad"). InnerVolumeSpecName "kube-api-access-x59l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.133568 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x59l4\" (UniqueName: \"kubernetes.io/projected/35c14395-0a4c-47be-8f64-382e60e3faad-kube-api-access-x59l4\") on node \"crc\" DevicePath \"\"" Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.618514 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" event={"ID":"35c14395-0a4c-47be-8f64-382e60e3faad","Type":"ContainerDied","Data":"152feee1511bbc3ac328d2740d0920b24a83751d1d4c0a4e97f0c8bed6a7d401"} Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.618956 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="152feee1511bbc3ac328d2740d0920b24a83751d1d4c0a4e97f0c8bed6a7d401" Mar 20 17:24:13 crc kubenswrapper[4795]: I0320 17:24:13.618588 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-wjlwc" Mar 20 17:24:41 crc kubenswrapper[4795]: I0320 17:24:41.300447 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:24:41 crc kubenswrapper[4795]: I0320 17:24:41.301822 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.230359 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.231063 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kzvch" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="registry-server" containerID="cri-o://c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1" gracePeriod=30 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.248465 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.248837 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kk5rk" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="registry-server" containerID="cri-o://a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233" gracePeriod=30 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.255915 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.256864 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" containerID="cri-o://b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f" gracePeriod=30 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.274499 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.274884 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ht4zv" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="registry-server" containerID="cri-o://1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d" gracePeriod=30 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.283596 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8x76m"] Mar 20 17:25:00 crc kubenswrapper[4795]: E0320 17:25:00.283966 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c14395-0a4c-47be-8f64-382e60e3faad" containerName="oc" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.283986 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c14395-0a4c-47be-8f64-382e60e3faad" containerName="oc" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.284189 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c14395-0a4c-47be-8f64-382e60e3faad" containerName="oc" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.284800 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.290516 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.290815 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q7czt" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="registry-server" containerID="cri-o://5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c" gracePeriod=30 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.298059 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8x76m"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.379788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.379833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tntxg\" (UniqueName: \"kubernetes.io/projected/a2de2777-57e1-4310-a878-1cfc1fc77e44-kube-api-access-tntxg\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.379876 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.480767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.480821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tntxg\" (UniqueName: \"kubernetes.io/projected/a2de2777-57e1-4310-a878-1cfc1fc77e44-kube-api-access-tntxg\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.480868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.482697 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc 
kubenswrapper[4795]: I0320 17:25:00.493732 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2de2777-57e1-4310-a878-1cfc1fc77e44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.499232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tntxg\" (UniqueName: \"kubernetes.io/projected/a2de2777-57e1-4310-a878-1cfc1fc77e44-kube-api-access-tntxg\") pod \"marketplace-operator-79b997595-8x76m\" (UID: \"a2de2777-57e1-4310-a878-1cfc1fc77e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.547968 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8jcpg"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.548549 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.578044 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8jcpg"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.600591 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.639286 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690838 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d9597a8-43b0-4f3e-adb9-5f0d32479431-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690912 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-tls\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-trusted-ca\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-bound-sa-token\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.690996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5c2p\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-kube-api-access-n5c2p\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.691023 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d9597a8-43b0-4f3e-adb9-5f0d32479431-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.691045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-certificates\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.695024 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.703258 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.751277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807181 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities\") pod \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities\") pod \"57849322-f280-42ee-a330-18120aeed5db\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807255 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cdbk\" (UniqueName: \"kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk\") pod \"ed1a790f-ddf0-4512-88c5-dba972460e8a\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807290 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2vjc\" (UniqueName: \"kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc\") pod \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 
17:25:00.807323 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content\") pod \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\" (UID: \"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6zsd\" (UniqueName: \"kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd\") pod \"57849322-f280-42ee-a330-18120aeed5db\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807454 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics\") pod \"ed1a790f-ddf0-4512-88c5-dba972460e8a\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807488 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content\") pod \"57849322-f280-42ee-a330-18120aeed5db\" (UID: \"57849322-f280-42ee-a330-18120aeed5db\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.807511 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca\") pod \"ed1a790f-ddf0-4512-88c5-dba972460e8a\" (UID: \"ed1a790f-ddf0-4512-88c5-dba972460e8a\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808004 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5c2p\" (UniqueName: 
\"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-kube-api-access-n5c2p\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d9597a8-43b0-4f3e-adb9-5f0d32479431-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808077 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-certificates\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d9597a8-43b0-4f3e-adb9-5f0d32479431-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808145 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-tls\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808167 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-trusted-ca\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.808194 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-bound-sa-token\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.809761 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities" (OuterVolumeSpecName: "utilities") pod "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" (UID: "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.810954 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.811269 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-certificates\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.811900 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ed1a790f-ddf0-4512-88c5-dba972460e8a" (UID: "ed1a790f-ddf0-4512-88c5-dba972460e8a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.812165 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d9597a8-43b0-4f3e-adb9-5f0d32479431-trusted-ca\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.814351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4d9597a8-43b0-4f3e-adb9-5f0d32479431-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.814404 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities" (OuterVolumeSpecName: 
"utilities") pod "57849322-f280-42ee-a330-18120aeed5db" (UID: "57849322-f280-42ee-a330-18120aeed5db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.831989 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4d9597a8-43b0-4f3e-adb9-5f0d32479431-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.834782 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk" (OuterVolumeSpecName: "kube-api-access-4cdbk") pod "ed1a790f-ddf0-4512-88c5-dba972460e8a" (UID: "ed1a790f-ddf0-4512-88c5-dba972460e8a"). InnerVolumeSpecName "kube-api-access-4cdbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.841573 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ed1a790f-ddf0-4512-88c5-dba972460e8a" (UID: "ed1a790f-ddf0-4512-88c5-dba972460e8a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.859601 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc" (OuterVolumeSpecName: "kube-api-access-w2vjc") pod "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" (UID: "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f"). InnerVolumeSpecName "kube-api-access-w2vjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.859610 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-registry-tls\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.859616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-bound-sa-token\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.859742 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd" (OuterVolumeSpecName: "kube-api-access-x6zsd") pod "57849322-f280-42ee-a330-18120aeed5db" (UID: "57849322-f280-42ee-a330-18120aeed5db"). InnerVolumeSpecName "kube-api-access-x6zsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.859975 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5c2p\" (UniqueName: \"kubernetes.io/projected/4d9597a8-43b0-4f3e-adb9-5f0d32479431-kube-api-access-n5c2p\") pod \"image-registry-66df7c8f76-8jcpg\" (UID: \"4d9597a8-43b0-4f3e-adb9-5f0d32479431\") " pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.873394 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.896375 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57849322-f280-42ee-a330-18120aeed5db" (UID: "57849322-f280-42ee-a330-18120aeed5db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.904697 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" (UID: "fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908576 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities\") pod \"70000016-e928-4b11-a31d-4d08e9450a1c\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgvm8\" (UniqueName: \"kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8\") pod \"70000016-e928-4b11-a31d-4d08e9450a1c\" (UID: \"70000016-e928-4b11-a31d-4d08e9450a1c\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908729 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content\") pod \"70000016-e928-4b11-a31d-4d08e9450a1c\" (UID: 
\"70000016-e928-4b11-a31d-4d08e9450a1c\") " Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908906 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908919 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908927 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed1a790f-ddf0-4512-88c5-dba972460e8a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908936 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908944 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57849322-f280-42ee-a330-18120aeed5db-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908952 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cdbk\" (UniqueName: \"kubernetes.io/projected/ed1a790f-ddf0-4512-88c5-dba972460e8a-kube-api-access-4cdbk\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908961 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2vjc\" (UniqueName: \"kubernetes.io/projected/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-kube-api-access-w2vjc\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc 
kubenswrapper[4795]: I0320 17:25:00.908968 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.908977 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6zsd\" (UniqueName: \"kubernetes.io/projected/57849322-f280-42ee-a330-18120aeed5db-kube-api-access-x6zsd\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.909324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities" (OuterVolumeSpecName: "utilities") pod "70000016-e928-4b11-a31d-4d08e9450a1c" (UID: "70000016-e928-4b11-a31d-4d08e9450a1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.912180 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8" (OuterVolumeSpecName: "kube-api-access-sgvm8") pod "70000016-e928-4b11-a31d-4d08e9450a1c" (UID: "70000016-e928-4b11-a31d-4d08e9450a1c"). InnerVolumeSpecName "kube-api-access-sgvm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926161 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926520 4795 generic.go:334] "Generic (PLEG): container finished" podID="57849322-f280-42ee-a330-18120aeed5db" containerID="a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233" exitCode=0 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerDied","Data":"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926594 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk5rk" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926609 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5rk" event={"ID":"57849322-f280-42ee-a330-18120aeed5db","Type":"ContainerDied","Data":"333adeb9b81abd47208fc6ec71e454bad1f18be9356efa101b49dd2d5983cc19"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.926630 4795 scope.go:117] "RemoveContainer" containerID="a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.931494 4795 generic.go:334] "Generic (PLEG): container finished" podID="70000016-e928-4b11-a31d-4d08e9450a1c" containerID="1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d" exitCode=0 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.931548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerDied","Data":"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 
17:25:00.931577 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4zv" event={"ID":"70000016-e928-4b11-a31d-4d08e9450a1c","Type":"ContainerDied","Data":"37c22f0e8db69278ef99884d66f0d1b39626955adbad846c6823797b6df30257"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.931625 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4zv" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.932441 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70000016-e928-4b11-a31d-4d08e9450a1c" (UID: "70000016-e928-4b11-a31d-4d08e9450a1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.933804 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerID="b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f" exitCode=0 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.933869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerDied","Data":"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.933898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" event={"ID":"ed1a790f-ddf0-4512-88c5-dba972460e8a","Type":"ContainerDied","Data":"10ac9aefe8ac1466c7fac8993e74ddbafb9c6821332b48f3d05657ff9290f6e5"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.933945 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-clvzs" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.944190 4795 generic.go:334] "Generic (PLEG): container finished" podID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerID="c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1" exitCode=0 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.944249 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerDied","Data":"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.944271 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kzvch" event={"ID":"fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f","Type":"ContainerDied","Data":"352f21e959b8a9617f62fdaa474337c620b65ea35de203e2a6258d4f6ab66557"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.944281 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kzvch" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.946491 4795 generic.go:334] "Generic (PLEG): container finished" podID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerID="5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c" exitCode=0 Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.946530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerDied","Data":"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.946559 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q7czt" event={"ID":"73dd05f7-2cc4-4a99-b12d-26e4d436acca","Type":"ContainerDied","Data":"d21b253fa758e914360b02dd8aa7261d5b383defcc69cfc8b102952a167fd840"} Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.946615 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q7czt" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.946809 4795 scope.go:117] "RemoveContainer" containerID="36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.961257 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.966958 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kk5rk"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.970434 4795 scope.go:117] "RemoveContainer" containerID="cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1" Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.984972 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.991878 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-clvzs"] Mar 20 17:25:00 crc kubenswrapper[4795]: I0320 17:25:00.995030 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.001876 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kzvch"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.008461 4795 scope.go:117] "RemoveContainer" containerID="a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.008979 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233\": container with ID starting with 
a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233 not found: ID does not exist" containerID="a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009006 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233"} err="failed to get container status \"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233\": rpc error: code = NotFound desc = could not find container \"a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233\": container with ID starting with a04dea57598b1293253e646310cd442a1dfba04945bcbbbf34414cc9b732e233 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009027 4795 scope.go:117] "RemoveContainer" containerID="36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.009374 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f\": container with ID starting with 36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f not found: ID does not exist" containerID="36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009393 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f"} err="failed to get container status \"36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f\": rpc error: code = NotFound desc = could not find container \"36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f\": container with ID starting with 36c24a0e52e22487ddf8d55fd15cd389a33d357a79d9f7b03e7f1bc76484cf7f not found: ID does not 
exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009405 4795 scope.go:117] "RemoveContainer" containerID="cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009558 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhxfb\" (UniqueName: \"kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb\") pod \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content\") pod \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.009671 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1\": container with ID starting with cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1 not found: ID does not exist" containerID="cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009880 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1"} err="failed to get container status \"cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1\": rpc error: code = NotFound desc = could not find container \"cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1\": container with ID starting with cbb98a7162dc32310fa951b843716f75317a2d5bb3dcde252797b922f313cae1 not found: ID does not exist" Mar 20 17:25:01 crc 
kubenswrapper[4795]: I0320 17:25:01.009925 4795 scope.go:117] "RemoveContainer" containerID="1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.009844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities\") pod \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\" (UID: \"73dd05f7-2cc4-4a99-b12d-26e4d436acca\") " Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.010459 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.010480 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgvm8\" (UniqueName: \"kubernetes.io/projected/70000016-e928-4b11-a31d-4d08e9450a1c-kube-api-access-sgvm8\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.010508 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70000016-e928-4b11-a31d-4d08e9450a1c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.010765 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities" (OuterVolumeSpecName: "utilities") pod "73dd05f7-2cc4-4a99-b12d-26e4d436acca" (UID: "73dd05f7-2cc4-4a99-b12d-26e4d436acca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.012829 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb" (OuterVolumeSpecName: "kube-api-access-rhxfb") pod "73dd05f7-2cc4-4a99-b12d-26e4d436acca" (UID: "73dd05f7-2cc4-4a99-b12d-26e4d436acca"). InnerVolumeSpecName "kube-api-access-rhxfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.024381 4795 scope.go:117] "RemoveContainer" containerID="6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.035721 4795 scope.go:117] "RemoveContainer" containerID="9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.063558 4795 scope.go:117] "RemoveContainer" containerID="1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.065598 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d\": container with ID starting with 1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d not found: ID does not exist" containerID="1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.065633 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d"} err="failed to get container status \"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d\": rpc error: code = NotFound desc = could not find container \"1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d\": container 
with ID starting with 1010384f39375504a08aeb061beb1b6449f321f8396b3036fda1c22e3d6d876d not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.065654 4795 scope.go:117] "RemoveContainer" containerID="6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.066036 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e\": container with ID starting with 6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e not found: ID does not exist" containerID="6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.066051 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e"} err="failed to get container status \"6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e\": rpc error: code = NotFound desc = could not find container \"6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e\": container with ID starting with 6530bf3f2dd6eea7a8f95279c5330a0d3cbea26fd3f2f920ef8b7e287f16305e not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.066064 4795 scope.go:117] "RemoveContainer" containerID="9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.066421 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25\": container with ID starting with 9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25 not found: ID does not exist" containerID="9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25" 
Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.066484 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25"} err="failed to get container status \"9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25\": rpc error: code = NotFound desc = could not find container \"9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25\": container with ID starting with 9965cf361c4eea187c97ddd65767ffb9b854edd52cdb087fefa769b600216f25 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.066519 4795 scope.go:117] "RemoveContainer" containerID="b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.083368 4795 scope.go:117] "RemoveContainer" containerID="c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.098852 4795 scope.go:117] "RemoveContainer" containerID="b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.099236 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f\": container with ID starting with b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f not found: ID does not exist" containerID="b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.099267 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f"} err="failed to get container status \"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f\": rpc error: code = NotFound desc = could not find container 
\"b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f\": container with ID starting with b8140590dfad77e4995157012eae5e154b636c26ae92f20663a9ecfe9ad5f07f not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.099288 4795 scope.go:117] "RemoveContainer" containerID="c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.099581 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe\": container with ID starting with c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe not found: ID does not exist" containerID="c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.099626 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe"} err="failed to get container status \"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe\": rpc error: code = NotFound desc = could not find container \"c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe\": container with ID starting with c08d461dbcc16715221446ea514e13c9e0f28fedd91ff6786f66e53e87ab27fe not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.099659 4795 scope.go:117] "RemoveContainer" containerID="c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.112575 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.112600 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rhxfb\" (UniqueName: \"kubernetes.io/projected/73dd05f7-2cc4-4a99-b12d-26e4d436acca-kube-api-access-rhxfb\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.123822 4795 scope.go:117] "RemoveContainer" containerID="824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.161824 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8x76m"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.162847 4795 scope.go:117] "RemoveContainer" containerID="63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031" Mar 20 17:25:01 crc kubenswrapper[4795]: W0320 17:25:01.174274 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2de2777_57e1_4310_a878_1cfc1fc77e44.slice/crio-f24ef2dd48b7431034546306162844649d0ed3d2e1c74ff319399c5d956ac22e WatchSource:0}: Error finding container f24ef2dd48b7431034546306162844649d0ed3d2e1c74ff319399c5d956ac22e: Status 404 returned error can't find the container with id f24ef2dd48b7431034546306162844649d0ed3d2e1c74ff319399c5d956ac22e Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.180942 4795 scope.go:117] "RemoveContainer" containerID="c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.182616 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1\": container with ID starting with c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1 not found: ID does not exist" containerID="c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.182653 4795 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1"} err="failed to get container status \"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1\": rpc error: code = NotFound desc = could not find container \"c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1\": container with ID starting with c08c867dec19cfac3c18aefcfc16d1ece051bf88c2d9de7ef126fd45bc4722f1 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.182703 4795 scope.go:117] "RemoveContainer" containerID="824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.183497 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741\": container with ID starting with 824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741 not found: ID does not exist" containerID="824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.183546 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741"} err="failed to get container status \"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741\": rpc error: code = NotFound desc = could not find container \"824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741\": container with ID starting with 824c1f80f4545ea3d21c288533b0067d2955ddfc9989de40a611140ab5580741 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.183581 4795 scope.go:117] "RemoveContainer" containerID="63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.183968 4795 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031\": container with ID starting with 63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031 not found: ID does not exist" containerID="63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.183992 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031"} err="failed to get container status \"63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031\": rpc error: code = NotFound desc = could not find container \"63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031\": container with ID starting with 63ae4a4bbf4b44f859c356b4431c475213d7c12457222fe8123b3085226b7031 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.184007 4795 scope.go:117] "RemoveContainer" containerID="5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.185656 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8jcpg"] Mar 20 17:25:01 crc kubenswrapper[4795]: W0320 17:25:01.188831 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d9597a8_43b0_4f3e_adb9_5f0d32479431.slice/crio-5be6a5ac2fd99a192bb8e8787f6170afcbaf23bcc8e9fdb4f2ffddbf94d3514c WatchSource:0}: Error finding container 5be6a5ac2fd99a192bb8e8787f6170afcbaf23bcc8e9fdb4f2ffddbf94d3514c: Status 404 returned error can't find the container with id 5be6a5ac2fd99a192bb8e8787f6170afcbaf23bcc8e9fdb4f2ffddbf94d3514c Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.193394 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73dd05f7-2cc4-4a99-b12d-26e4d436acca" (UID: "73dd05f7-2cc4-4a99-b12d-26e4d436acca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.195945 4795 scope.go:117] "RemoveContainer" containerID="6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.211953 4795 scope.go:117] "RemoveContainer" containerID="6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.213280 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73dd05f7-2cc4-4a99-b12d-26e4d436acca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.227281 4795 scope.go:117] "RemoveContainer" containerID="5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.227952 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c\": container with ID starting with 5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c not found: ID does not exist" containerID="5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.228004 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c"} err="failed to get container status \"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c\": rpc error: code = NotFound desc = could not find container 
\"5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c\": container with ID starting with 5a9ca90e946a89299f5ea7e44d6c7070096545a9b93d760973ab639601d8397c not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.228037 4795 scope.go:117] "RemoveContainer" containerID="6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.228392 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa\": container with ID starting with 6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa not found: ID does not exist" containerID="6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.228432 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa"} err="failed to get container status \"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa\": rpc error: code = NotFound desc = could not find container \"6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa\": container with ID starting with 6eff54a68f437b957e50d490ce1e1b228099c64036d8c31592cba7133c4c13aa not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.228459 4795 scope.go:117] "RemoveContainer" containerID="6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394" Mar 20 17:25:01 crc kubenswrapper[4795]: E0320 17:25:01.228797 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394\": container with ID starting with 6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394 not found: ID does not exist" 
containerID="6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.228817 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394"} err="failed to get container status \"6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394\": rpc error: code = NotFound desc = could not find container \"6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394\": container with ID starting with 6648631587d4dd8ae23aaea82a099f7cb1fb2f04fceaceb1cb97f80e9f3c0394 not found: ID does not exist" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.269730 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57849322-f280-42ee-a330-18120aeed5db" path="/var/lib/kubelet/pods/57849322-f280-42ee-a330-18120aeed5db/volumes" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.272319 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" path="/var/lib/kubelet/pods/ed1a790f-ddf0-4512-88c5-dba972460e8a/volumes" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.274485 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" path="/var/lib/kubelet/pods/fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f/volumes" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.282425 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.282460 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4zv"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.282478 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.285255 
4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q7czt"] Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.956994 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" event={"ID":"a2de2777-57e1-4310-a878-1cfc1fc77e44","Type":"ContainerStarted","Data":"4982d80de10a2ffa83b3a05d649ed0a52a93c8fee37931a38e37520f7e0db035"} Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.957248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" event={"ID":"a2de2777-57e1-4310-a878-1cfc1fc77e44","Type":"ContainerStarted","Data":"f24ef2dd48b7431034546306162844649d0ed3d2e1c74ff319399c5d956ac22e"} Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.958183 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.959820 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" event={"ID":"4d9597a8-43b0-4f3e-adb9-5f0d32479431","Type":"ContainerStarted","Data":"a06ed4cb968c32d980ff6898fdf2fbbb3c2fdae55a9d73f63a4279b52e120cd0"} Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.960058 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.960210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" event={"ID":"4d9597a8-43b0-4f3e-adb9-5f0d32479431","Type":"ContainerStarted","Data":"5be6a5ac2fd99a192bb8e8787f6170afcbaf23bcc8e9fdb4f2ffddbf94d3514c"} Mar 20 17:25:01 crc kubenswrapper[4795]: I0320 17:25:01.964345 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.000707 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" podStartSLOduration=2.000654018 podStartE2EDuration="2.000654018s" podCreationTimestamp="2026-03-20 17:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:25:02.000112811 +0000 UTC m=+445.458144422" watchObservedRunningTime="2026-03-20 17:25:02.000654018 +0000 UTC m=+445.458685599" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.005090 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8x76m" podStartSLOduration=2.00507188 podStartE2EDuration="2.00507188s" podCreationTimestamp="2026-03-20 17:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:25:01.977504057 +0000 UTC m=+445.435535618" watchObservedRunningTime="2026-03-20 17:25:02.00507188 +0000 UTC m=+445.463103461" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.448671 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nwm6j"] Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449018 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449033 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449045 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" 
containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449052 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449060 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449067 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449075 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449083 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449095 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449102 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449113 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449120 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449130 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" 
containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449137 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449147 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449154 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="extract-content" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449167 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449176 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449184 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449190 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449200 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449208 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449224 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449234 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449241 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="extract-utilities" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449347 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449361 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd16fcdb-824f-4c4e-8cf2-0c33b15cff4f" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449374 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="57849322-f280-42ee-a330-18120aeed5db" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449384 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449396 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449409 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" containerName="registry-server" Mar 20 17:25:02 crc kubenswrapper[4795]: E0320 17:25:02.449519 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.449529 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1a790f-ddf0-4512-88c5-dba972460e8a" containerName="marketplace-operator" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.450248 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.453033 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.462854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwm6j"] Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.533323 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7r4h\" (UniqueName: \"kubernetes.io/projected/38a67438-04e3-433b-9b32-47acf98b3086-kube-api-access-j7r4h\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.533445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-utilities\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.533493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-catalog-content\") pod 
\"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.634634 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7r4h\" (UniqueName: \"kubernetes.io/projected/38a67438-04e3-433b-9b32-47acf98b3086-kube-api-access-j7r4h\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.634856 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-utilities\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.634982 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-catalog-content\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.635413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-catalog-content\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.635750 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a67438-04e3-433b-9b32-47acf98b3086-utilities\") pod \"redhat-marketplace-nwm6j\" (UID: 
\"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.656655 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-94mw5"] Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.673595 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.674134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7r4h\" (UniqueName: \"kubernetes.io/projected/38a67438-04e3-433b-9b32-47acf98b3086-kube-api-access-j7r4h\") pod \"redhat-marketplace-nwm6j\" (UID: \"38a67438-04e3-433b-9b32-47acf98b3086\") " pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.676920 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.683056 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94mw5"] Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.736648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.736748 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " 
pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.736797 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th9xz\" (UniqueName: \"kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.777503 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.837835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.837958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.838023 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th9xz\" (UniqueName: \"kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.838221 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.838519 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:02 crc kubenswrapper[4795]: I0320 17:25:02.858455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th9xz\" (UniqueName: \"kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz\") pod \"certified-operators-94mw5\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") " pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.012114 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwm6j"] Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.029121 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.258738 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70000016-e928-4b11-a31d-4d08e9450a1c" path="/var/lib/kubelet/pods/70000016-e928-4b11-a31d-4d08e9450a1c/volumes" Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.259842 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73dd05f7-2cc4-4a99-b12d-26e4d436acca" path="/var/lib/kubelet/pods/73dd05f7-2cc4-4a99-b12d-26e4d436acca/volumes" Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.260443 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94mw5"] Mar 20 17:25:03 crc kubenswrapper[4795]: W0320 17:25:03.280114 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda79f11dc_5b5e_4929_9a6f_281ade73c24a.slice/crio-5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd WatchSource:0}: Error finding container 5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd: Status 404 returned error can't find the container with id 5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.971890 4795 generic.go:334] "Generic (PLEG): container finished" podID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerID="ec6a69189563a780b942ae970e8e1801846953cabcf1239c190354a1203053b4" exitCode=0 Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.972002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mw5" event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerDied","Data":"ec6a69189563a780b942ae970e8e1801846953cabcf1239c190354a1203053b4"} Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.972239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-94mw5" event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerStarted","Data":"5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd"} Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.973654 4795 generic.go:334] "Generic (PLEG): container finished" podID="38a67438-04e3-433b-9b32-47acf98b3086" containerID="2b59f88f4a5d6912f063edde8df8197451820dd6052a0a1a69bab7cad387c6f9" exitCode=0 Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.974389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwm6j" event={"ID":"38a67438-04e3-433b-9b32-47acf98b3086","Type":"ContainerDied","Data":"2b59f88f4a5d6912f063edde8df8197451820dd6052a0a1a69bab7cad387c6f9"} Mar 20 17:25:03 crc kubenswrapper[4795]: I0320 17:25:03.974430 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwm6j" event={"ID":"38a67438-04e3-433b-9b32-47acf98b3086","Type":"ContainerStarted","Data":"774dbfb3b638328353f7b91c57600c5171af1dd49c62166305a2e0b942f5bd70"} Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.847518 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n22t9"] Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.848795 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.852473 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.867389 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n22t9"] Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.979392 4795 generic.go:334] "Generic (PLEG): container finished" podID="38a67438-04e3-433b-9b32-47acf98b3086" containerID="8a6a21b4b6e94c1716bbf31482dd407fbaadbaad29cd8b437718a3eed01d163b" exitCode=0 Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.979440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwm6j" event={"ID":"38a67438-04e3-433b-9b32-47acf98b3086","Type":"ContainerDied","Data":"8a6a21b4b6e94c1716bbf31482dd407fbaadbaad29cd8b437718a3eed01d163b"} Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.981812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jkr4\" (UniqueName: \"kubernetes.io/projected/fb801735-41d3-4c6e-b9e7-083ad510100a-kube-api-access-2jkr4\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.981952 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-catalog-content\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:04 crc kubenswrapper[4795]: I0320 17:25:04.982026 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-utilities\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.053551 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tw8kt"] Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.054740 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.074886 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.088430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-catalog-content\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.088537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-utilities\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.088640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jkr4\" (UniqueName: \"kubernetes.io/projected/fb801735-41d3-4c6e-b9e7-083ad510100a-kube-api-access-2jkr4\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc 
kubenswrapper[4795]: I0320 17:25:05.089649 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-utilities\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.089872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb801735-41d3-4c6e-b9e7-083ad510100a-catalog-content\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.092857 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tw8kt"] Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.108478 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jkr4\" (UniqueName: \"kubernetes.io/projected/fb801735-41d3-4c6e-b9e7-083ad510100a-kube-api-access-2jkr4\") pod \"redhat-operators-n22t9\" (UID: \"fb801735-41d3-4c6e-b9e7-083ad510100a\") " pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.181419 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.190073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8j2j\" (UniqueName: \"kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.190132 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.190168 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.291136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.291420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8j2j\" (UniqueName: \"kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j\") pod 
\"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.291453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.291848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.291885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.321712 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8j2j\" (UniqueName: \"kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j\") pod \"community-operators-tw8kt\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.369012 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n22t9"] Mar 20 17:25:05 crc kubenswrapper[4795]: W0320 17:25:05.378818 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb801735_41d3_4c6e_b9e7_083ad510100a.slice/crio-10b2d86feb55feb90fd004975acafbf78736cfa66cdbf35ae03cee5ef43d6767 WatchSource:0}: Error finding container 10b2d86feb55feb90fd004975acafbf78736cfa66cdbf35ae03cee5ef43d6767: Status 404 returned error can't find the container with id 10b2d86feb55feb90fd004975acafbf78736cfa66cdbf35ae03cee5ef43d6767 Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.400085 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.569158 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tw8kt"] Mar 20 17:25:05 crc kubenswrapper[4795]: W0320 17:25:05.585408 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cba71d7_62e8_4541_9728_23dd5ff4b982.slice/crio-dc6b8994280ee1e1b27cf8f2c886374ff8bd967d27295a99223b5cd05c51c5e5 WatchSource:0}: Error finding container dc6b8994280ee1e1b27cf8f2c886374ff8bd967d27295a99223b5cd05c51c5e5: Status 404 returned error can't find the container with id dc6b8994280ee1e1b27cf8f2c886374ff8bd967d27295a99223b5cd05c51c5e5 Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.986569 4795 generic.go:334] "Generic (PLEG): container finished" podID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerID="099eb6fe1b44619943ee789acf319c90001ea00f649ef59a36a0aa98e76bd549" exitCode=0 Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.986646 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerDied","Data":"099eb6fe1b44619943ee789acf319c90001ea00f649ef59a36a0aa98e76bd549"} Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.986676 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerStarted","Data":"dc6b8994280ee1e1b27cf8f2c886374ff8bd967d27295a99223b5cd05c51c5e5"} Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.988063 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb801735-41d3-4c6e-b9e7-083ad510100a" containerID="f2d66cb7edd3c4882b542108d443235dd96024cbaf91d61c20130ec249f2d423" exitCode=0 Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.988103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n22t9" event={"ID":"fb801735-41d3-4c6e-b9e7-083ad510100a","Type":"ContainerDied","Data":"f2d66cb7edd3c4882b542108d443235dd96024cbaf91d61c20130ec249f2d423"} Mar 20 17:25:05 crc kubenswrapper[4795]: I0320 17:25:05.988128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n22t9" event={"ID":"fb801735-41d3-4c6e-b9e7-083ad510100a","Type":"ContainerStarted","Data":"10b2d86feb55feb90fd004975acafbf78736cfa66cdbf35ae03cee5ef43d6767"} Mar 20 17:25:07 crc kubenswrapper[4795]: I0320 17:25:06.998170 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwm6j" event={"ID":"38a67438-04e3-433b-9b32-47acf98b3086","Type":"ContainerStarted","Data":"502dc732bec40dfb721185d1e3ed96c41b58a81919acd65a8913bb1006af66d6"} Mar 20 17:25:07 crc kubenswrapper[4795]: I0320 17:25:07.001320 4795 generic.go:334] "Generic (PLEG): container finished" podID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerID="88b194a74064309f622b8e25f76f210948d20e5936b41beb91453d2773fb7483" exitCode=0 Mar 20 17:25:07 crc kubenswrapper[4795]: I0320 17:25:07.001345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mw5" 
event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerDied","Data":"88b194a74064309f622b8e25f76f210948d20e5936b41beb91453d2773fb7483"} Mar 20 17:25:07 crc kubenswrapper[4795]: I0320 17:25:07.019222 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nwm6j" podStartSLOduration=3.107960716 podStartE2EDuration="5.01919827s" podCreationTimestamp="2026-03-20 17:25:02 +0000 UTC" firstStartedPulling="2026-03-20 17:25:03.975544703 +0000 UTC m=+447.433576234" lastFinishedPulling="2026-03-20 17:25:05.886782217 +0000 UTC m=+449.344813788" observedRunningTime="2026-03-20 17:25:07.017169254 +0000 UTC m=+450.475200815" watchObservedRunningTime="2026-03-20 17:25:07.01919827 +0000 UTC m=+450.477229811" Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.009493 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mw5" event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerStarted","Data":"4e8b0e1259002a3662a0cddeb6fd2fae0a2ae00aa800f638b73b2262d55bedd2"} Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.012242 4795 generic.go:334] "Generic (PLEG): container finished" podID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerID="8963fe7721a09d9f6c228e790432497f6b1fff70d60afc4485e7fcd92391890f" exitCode=0 Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.012443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerDied","Data":"8963fe7721a09d9f6c228e790432497f6b1fff70d60afc4485e7fcd92391890f"} Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.015289 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb801735-41d3-4c6e-b9e7-083ad510100a" containerID="b4ccc0976c4f09d21e82657e8bb7bfc336959e539c86beb3d45bb5f675188895" exitCode=0 Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.016300 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n22t9" event={"ID":"fb801735-41d3-4c6e-b9e7-083ad510100a","Type":"ContainerDied","Data":"b4ccc0976c4f09d21e82657e8bb7bfc336959e539c86beb3d45bb5f675188895"} Mar 20 17:25:08 crc kubenswrapper[4795]: I0320 17:25:08.037717 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-94mw5" podStartSLOduration=2.20988869 podStartE2EDuration="6.037667991s" podCreationTimestamp="2026-03-20 17:25:02 +0000 UTC" firstStartedPulling="2026-03-20 17:25:03.974417246 +0000 UTC m=+447.432448787" lastFinishedPulling="2026-03-20 17:25:07.802196517 +0000 UTC m=+451.260228088" observedRunningTime="2026-03-20 17:25:08.034283432 +0000 UTC m=+451.492314983" watchObservedRunningTime="2026-03-20 17:25:08.037667991 +0000 UTC m=+451.495699542" Mar 20 17:25:09 crc kubenswrapper[4795]: I0320 17:25:09.023580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerStarted","Data":"e51f3706cb85710070577b764255e613eb4bda5a66f7cf44046e8dca83ade02b"} Mar 20 17:25:09 crc kubenswrapper[4795]: I0320 17:25:09.042543 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tw8kt" podStartSLOduration=1.424651882 podStartE2EDuration="4.042525416s" podCreationTimestamp="2026-03-20 17:25:05 +0000 UTC" firstStartedPulling="2026-03-20 17:25:06.031289228 +0000 UTC m=+449.489320769" lastFinishedPulling="2026-03-20 17:25:08.649162752 +0000 UTC m=+452.107194303" observedRunningTime="2026-03-20 17:25:09.038115045 +0000 UTC m=+452.496146606" watchObservedRunningTime="2026-03-20 17:25:09.042525416 +0000 UTC m=+452.500556967" Mar 20 17:25:10 crc kubenswrapper[4795]: I0320 17:25:10.031987 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-n22t9" event={"ID":"fb801735-41d3-4c6e-b9e7-083ad510100a","Type":"ContainerStarted","Data":"07f9f94328a1c188e056e6fb632a62a1885a43085acf040b09ef3b2300259b0f"} Mar 20 17:25:10 crc kubenswrapper[4795]: I0320 17:25:10.056987 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n22t9" podStartSLOduration=3.202547156 podStartE2EDuration="6.056969079s" podCreationTimestamp="2026-03-20 17:25:04 +0000 UTC" firstStartedPulling="2026-03-20 17:25:06.031371681 +0000 UTC m=+449.489403262" lastFinishedPulling="2026-03-20 17:25:08.885793634 +0000 UTC m=+452.343825185" observedRunningTime="2026-03-20 17:25:10.055520982 +0000 UTC m=+453.513552533" watchObservedRunningTime="2026-03-20 17:25:10.056969079 +0000 UTC m=+453.515000620" Mar 20 17:25:11 crc kubenswrapper[4795]: I0320 17:25:11.300464 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:25:11 crc kubenswrapper[4795]: I0320 17:25:11.300518 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:25:12 crc kubenswrapper[4795]: I0320 17:25:12.778357 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:12 crc kubenswrapper[4795]: I0320 17:25:12.778931 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:12 crc kubenswrapper[4795]: 
I0320 17:25:12.843135 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:13 crc kubenswrapper[4795]: I0320 17:25:13.030256 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:13 crc kubenswrapper[4795]: I0320 17:25:13.030315 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:13 crc kubenswrapper[4795]: I0320 17:25:13.087532 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:13 crc kubenswrapper[4795]: I0320 17:25:13.127577 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nwm6j" Mar 20 17:25:13 crc kubenswrapper[4795]: I0320 17:25:13.149407 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-94mw5" Mar 20 17:25:15 crc kubenswrapper[4795]: I0320 17:25:15.182000 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:15 crc kubenswrapper[4795]: I0320 17:25:15.182078 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:15 crc kubenswrapper[4795]: I0320 17:25:15.401070 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:15 crc kubenswrapper[4795]: I0320 17:25:15.401172 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:15 crc kubenswrapper[4795]: I0320 17:25:15.466875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:16 crc kubenswrapper[4795]: I0320 17:25:16.108658 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:25:16 crc kubenswrapper[4795]: I0320 17:25:16.239412 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n22t9" podUID="fb801735-41d3-4c6e-b9e7-083ad510100a" containerName="registry-server" probeResult="failure" output=< Mar 20 17:25:16 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:25:16 crc kubenswrapper[4795]: > Mar 20 17:25:20 crc kubenswrapper[4795]: I0320 17:25:20.932332 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8jcpg" Mar 20 17:25:21 crc kubenswrapper[4795]: I0320 17:25:21.028671 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"] Mar 20 17:25:25 crc kubenswrapper[4795]: I0320 17:25:25.263818 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:25 crc kubenswrapper[4795]: I0320 17:25:25.334435 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n22t9" Mar 20 17:25:41 crc kubenswrapper[4795]: I0320 17:25:41.300422 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:25:41 crc kubenswrapper[4795]: I0320 17:25:41.301059 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:25:41 crc kubenswrapper[4795]: I0320 17:25:41.301143 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:25:41 crc kubenswrapper[4795]: I0320 17:25:41.302120 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:25:41 crc kubenswrapper[4795]: I0320 17:25:41.302210 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b" gracePeriod=600 Mar 20 17:25:42 crc kubenswrapper[4795]: I0320 17:25:42.419785 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b" exitCode=0 Mar 20 17:25:42 crc kubenswrapper[4795]: I0320 17:25:42.419914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b"} Mar 20 17:25:42 crc kubenswrapper[4795]: I0320 17:25:42.420244 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f"} Mar 20 17:25:42 crc kubenswrapper[4795]: I0320 17:25:42.420265 4795 scope.go:117] "RemoveContainer" containerID="6b19cff73ca18043345f90e867b118aa612f5f22a679048b20d6f81354822506" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.069293 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" podUID="3dde633a-aefe-4c9b-84a7-301279016583" containerName="registry" containerID="cri-o://76b0d688b149e45910b8799bbb4e20410e0480f70929e235a28f86178319123a" gracePeriod=30 Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.449578 4795 generic.go:334] "Generic (PLEG): container finished" podID="3dde633a-aefe-4c9b-84a7-301279016583" containerID="76b0d688b149e45910b8799bbb4e20410e0480f70929e235a28f86178319123a" exitCode=0 Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.449726 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" event={"ID":"3dde633a-aefe-4c9b-84a7-301279016583","Type":"ContainerDied","Data":"76b0d688b149e45910b8799bbb4e20410e0480f70929e235a28f86178319123a"} Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.449934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" event={"ID":"3dde633a-aefe-4c9b-84a7-301279016583","Type":"ContainerDied","Data":"3db277134197ff5142f4f0d85c126502b98d3bb29670b6c4409e582bcdf40d86"} Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.449952 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3db277134197ff5142f4f0d85c126502b98d3bb29670b6c4409e582bcdf40d86" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.452660 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.465836 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.465909 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.465959 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.466016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.466065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.466910 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.467181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.467313 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2bwk\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.467862 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.468030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"3dde633a-aefe-4c9b-84a7-301279016583\" (UID: \"3dde633a-aefe-4c9b-84a7-301279016583\") " Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.468360 4795 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.468392 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3dde633a-aefe-4c9b-84a7-301279016583-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.473587 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.474519 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.476950 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.513384 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.514677 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk" (OuterVolumeSpecName: "kube-api-access-p2bwk") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "kube-api-access-p2bwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.518013 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "3dde633a-aefe-4c9b-84a7-301279016583" (UID: "3dde633a-aefe-4c9b-84a7-301279016583"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.570014 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2bwk\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-kube-api-access-p2bwk\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.570059 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.570080 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3dde633a-aefe-4c9b-84a7-301279016583-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.570098 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3dde633a-aefe-4c9b-84a7-301279016583-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:46 crc kubenswrapper[4795]: I0320 17:25:46.570115 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3dde633a-aefe-4c9b-84a7-301279016583-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 17:25:47 crc kubenswrapper[4795]: I0320 17:25:47.456551 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6rj8" Mar 20 17:25:47 crc kubenswrapper[4795]: I0320 17:25:47.486549 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"] Mar 20 17:25:47 crc kubenswrapper[4795]: I0320 17:25:47.499813 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6rj8"] Mar 20 17:25:49 crc kubenswrapper[4795]: I0320 17:25:49.263025 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dde633a-aefe-4c9b-84a7-301279016583" path="/var/lib/kubelet/pods/3dde633a-aefe-4c9b-84a7-301279016583/volumes" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.146785 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567126-nhz8w"] Mar 20 17:26:00 crc kubenswrapper[4795]: E0320 17:26:00.148005 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dde633a-aefe-4c9b-84a7-301279016583" containerName="registry" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.148033 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dde633a-aefe-4c9b-84a7-301279016583" containerName="registry" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.148291 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dde633a-aefe-4c9b-84a7-301279016583" containerName="registry" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.149136 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.156516 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.157378 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.157868 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.158286 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-nhz8w"] Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.256862 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cn9h\" (UniqueName: \"kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h\") pod \"auto-csr-approver-29567126-nhz8w\" (UID: \"740c1ddf-96e5-46f6-837c-73372748464e\") " pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.358802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cn9h\" (UniqueName: \"kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h\") pod \"auto-csr-approver-29567126-nhz8w\" (UID: \"740c1ddf-96e5-46f6-837c-73372748464e\") " pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.393662 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cn9h\" (UniqueName: \"kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h\") pod \"auto-csr-approver-29567126-nhz8w\" (UID: \"740c1ddf-96e5-46f6-837c-73372748464e\") " 
pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.482751 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:00 crc kubenswrapper[4795]: I0320 17:26:00.735914 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-nhz8w"] Mar 20 17:26:00 crc kubenswrapper[4795]: W0320 17:26:00.744453 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod740c1ddf_96e5_46f6_837c_73372748464e.slice/crio-127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6 WatchSource:0}: Error finding container 127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6: Status 404 returned error can't find the container with id 127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6 Mar 20 17:26:01 crc kubenswrapper[4795]: I0320 17:26:01.573993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" event={"ID":"740c1ddf-96e5-46f6-837c-73372748464e","Type":"ContainerStarted","Data":"127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6"} Mar 20 17:26:02 crc kubenswrapper[4795]: I0320 17:26:02.584071 4795 generic.go:334] "Generic (PLEG): container finished" podID="740c1ddf-96e5-46f6-837c-73372748464e" containerID="ea095688dd8877661afbf85ce172a04981e2524e4cbc5e45ea0fa637fadfbc39" exitCode=0 Mar 20 17:26:02 crc kubenswrapper[4795]: I0320 17:26:02.584211 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" event={"ID":"740c1ddf-96e5-46f6-837c-73372748464e","Type":"ContainerDied","Data":"ea095688dd8877661afbf85ce172a04981e2524e4cbc5e45ea0fa637fadfbc39"} Mar 20 17:26:03 crc kubenswrapper[4795]: I0320 17:26:03.834470 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.006668 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cn9h\" (UniqueName: \"kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h\") pod \"740c1ddf-96e5-46f6-837c-73372748464e\" (UID: \"740c1ddf-96e5-46f6-837c-73372748464e\") " Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.015873 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h" (OuterVolumeSpecName: "kube-api-access-8cn9h") pod "740c1ddf-96e5-46f6-837c-73372748464e" (UID: "740c1ddf-96e5-46f6-837c-73372748464e"). InnerVolumeSpecName "kube-api-access-8cn9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.108166 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cn9h\" (UniqueName: \"kubernetes.io/projected/740c1ddf-96e5-46f6-837c-73372748464e-kube-api-access-8cn9h\") on node \"crc\" DevicePath \"\"" Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.604304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" event={"ID":"740c1ddf-96e5-46f6-837c-73372748464e","Type":"ContainerDied","Data":"127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6"} Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.604362 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="127be9160d5e5111d0ea282517e8cf88adc08316653bccc8950c3de8a6e300f6" Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.604396 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-nhz8w" Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.915514 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-j7789"] Mar 20 17:26:04 crc kubenswrapper[4795]: I0320 17:26:04.920566 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-j7789"] Mar 20 17:26:05 crc kubenswrapper[4795]: I0320 17:26:05.260512 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed1d31b-b060-45c3-95bf-3b226a36efe1" path="/var/lib/kubelet/pods/bed1d31b-b060-45c3-95bf-3b226a36efe1/volumes" Mar 20 17:27:49 crc kubenswrapper[4795]: I0320 17:27:49.085854 4795 scope.go:117] "RemoveContainer" containerID="76aa98549ce46db60ce0a3b7fd4c6b9ed28e4c1b7375fc84abcdb33fcf4ef287" Mar 20 17:27:49 crc kubenswrapper[4795]: I0320 17:27:49.135497 4795 scope.go:117] "RemoveContainer" containerID="76b0d688b149e45910b8799bbb4e20410e0480f70929e235a28f86178319123a" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.149637 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567128-bqp8h"] Mar 20 17:28:00 crc kubenswrapper[4795]: E0320 17:28:00.150324 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740c1ddf-96e5-46f6-837c-73372748464e" containerName="oc" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.150345 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="740c1ddf-96e5-46f6-837c-73372748464e" containerName="oc" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.150525 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="740c1ddf-96e5-46f6-837c-73372748464e" containerName="oc" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.151151 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.154759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.154826 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.161298 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.165145 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-bqp8h"] Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.165256 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lhp4\" (UniqueName: \"kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4\") pod \"auto-csr-approver-29567128-bqp8h\" (UID: \"4be9f091-42a0-432b-8f14-700bc3e733cb\") " pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.267046 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lhp4\" (UniqueName: \"kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4\") pod \"auto-csr-approver-29567128-bqp8h\" (UID: \"4be9f091-42a0-432b-8f14-700bc3e733cb\") " pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.301642 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lhp4\" (UniqueName: \"kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4\") pod \"auto-csr-approver-29567128-bqp8h\" (UID: \"4be9f091-42a0-432b-8f14-700bc3e733cb\") " 
pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.479447 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.980771 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-bqp8h"] Mar 20 17:28:00 crc kubenswrapper[4795]: I0320 17:28:00.989877 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:28:01 crc kubenswrapper[4795]: I0320 17:28:01.438326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" event={"ID":"4be9f091-42a0-432b-8f14-700bc3e733cb","Type":"ContainerStarted","Data":"cfd52f60c14ca3ef1e9f12424bfa6942e34a20ac9b57389e7f65a806f4083442"} Mar 20 17:28:03 crc kubenswrapper[4795]: I0320 17:28:03.469468 4795 generic.go:334] "Generic (PLEG): container finished" podID="4be9f091-42a0-432b-8f14-700bc3e733cb" containerID="326b8c75bc495d3f796856aa4f0f247f31974ad88ddb26ad9ca2ca9ec8cf372a" exitCode=0 Mar 20 17:28:03 crc kubenswrapper[4795]: I0320 17:28:03.469536 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" event={"ID":"4be9f091-42a0-432b-8f14-700bc3e733cb","Type":"ContainerDied","Data":"326b8c75bc495d3f796856aa4f0f247f31974ad88ddb26ad9ca2ca9ec8cf372a"} Mar 20 17:28:04 crc kubenswrapper[4795]: I0320 17:28:04.809474 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:04 crc kubenswrapper[4795]: I0320 17:28:04.934270 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lhp4\" (UniqueName: \"kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4\") pod \"4be9f091-42a0-432b-8f14-700bc3e733cb\" (UID: \"4be9f091-42a0-432b-8f14-700bc3e733cb\") " Mar 20 17:28:04 crc kubenswrapper[4795]: I0320 17:28:04.940727 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4" (OuterVolumeSpecName: "kube-api-access-7lhp4") pod "4be9f091-42a0-432b-8f14-700bc3e733cb" (UID: "4be9f091-42a0-432b-8f14-700bc3e733cb"). InnerVolumeSpecName "kube-api-access-7lhp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.036838 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lhp4\" (UniqueName: \"kubernetes.io/projected/4be9f091-42a0-432b-8f14-700bc3e733cb-kube-api-access-7lhp4\") on node \"crc\" DevicePath \"\"" Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.485609 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" event={"ID":"4be9f091-42a0-432b-8f14-700bc3e733cb","Type":"ContainerDied","Data":"cfd52f60c14ca3ef1e9f12424bfa6942e34a20ac9b57389e7f65a806f4083442"} Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.485671 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfd52f60c14ca3ef1e9f12424bfa6942e34a20ac9b57389e7f65a806f4083442" Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.485791 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-bqp8h" Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.889755 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-fns4l"] Mar 20 17:28:05 crc kubenswrapper[4795]: I0320 17:28:05.893739 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-fns4l"] Mar 20 17:28:07 crc kubenswrapper[4795]: I0320 17:28:07.263133 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0486c12-c384-46ff-925b-bfeefb1d59bb" path="/var/lib/kubelet/pods/a0486c12-c384-46ff-925b-bfeefb1d59bb/volumes" Mar 20 17:28:11 crc kubenswrapper[4795]: I0320 17:28:11.300203 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:28:11 crc kubenswrapper[4795]: I0320 17:28:11.300283 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:28:41 crc kubenswrapper[4795]: I0320 17:28:41.300800 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:28:41 crc kubenswrapper[4795]: I0320 17:28:41.303083 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:28:49 crc kubenswrapper[4795]: I0320 17:28:49.188941 4795 scope.go:117] "RemoveContainer" containerID="14e15a12796f646063cb5f653e99e6ad23f1724726dfb97b08e9621c085665c1" Mar 20 17:29:11 crc kubenswrapper[4795]: I0320 17:29:11.300494 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:29:11 crc kubenswrapper[4795]: I0320 17:29:11.302205 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:29:11 crc kubenswrapper[4795]: I0320 17:29:11.302368 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:29:11 crc kubenswrapper[4795]: I0320 17:29:11.303382 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:29:11 crc kubenswrapper[4795]: I0320 17:29:11.303491 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f" gracePeriod=600 Mar 20 17:29:12 crc kubenswrapper[4795]: I0320 17:29:12.260293 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f" exitCode=0 Mar 20 17:29:12 crc kubenswrapper[4795]: I0320 17:29:12.260534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f"} Mar 20 17:29:12 crc kubenswrapper[4795]: I0320 17:29:12.260920 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842"} Mar 20 17:29:12 crc kubenswrapper[4795]: I0320 17:29:12.260953 4795 scope.go:117] "RemoveContainer" containerID="c5f72d4a4b206c3818fa7ca9603990a841d9414a46afec002ef14de99a2a6f6b" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.140718 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567130-kh5md"] Mar 20 17:30:00 crc kubenswrapper[4795]: E0320 17:30:00.141622 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be9f091-42a0-432b-8f14-700bc3e733cb" containerName="oc" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.141645 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be9f091-42a0-432b-8f14-700bc3e733cb" containerName="oc" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.141875 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4be9f091-42a0-432b-8f14-700bc3e733cb" containerName="oc" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.142589 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.145910 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.146659 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn"] Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.147509 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.150789 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.151070 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.152027 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.152518 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.161756 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-kh5md"] Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.169120 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn"] Mar 20 17:30:00 crc 
kubenswrapper[4795]: I0320 17:30:00.323856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx5ss\" (UniqueName: \"kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss\") pod \"auto-csr-approver-29567130-kh5md\" (UID: \"f93986a1-82a8-4eac-ba5e-f790196b25ce\") " pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.324006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.324070 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.324154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhbvb\" (UniqueName: \"kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.424867 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhbvb\" (UniqueName: 
\"kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.424971 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx5ss\" (UniqueName: \"kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss\") pod \"auto-csr-approver-29567130-kh5md\" (UID: \"f93986a1-82a8-4eac-ba5e-f790196b25ce\") " pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.425033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.425065 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.426752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.440172 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.445671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhbvb\" (UniqueName: \"kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb\") pod \"collect-profiles-29567130-p6qhn\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.447766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx5ss\" (UniqueName: \"kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss\") pod \"auto-csr-approver-29567130-kh5md\" (UID: \"f93986a1-82a8-4eac-ba5e-f790196b25ce\") " pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.465403 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.477558 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.729774 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-kh5md"] Mar 20 17:30:00 crc kubenswrapper[4795]: I0320 17:30:00.794966 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn"] Mar 20 17:30:00 crc kubenswrapper[4795]: W0320 17:30:00.801915 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6892589_ca9a_45cc_8991_ab0029e67e3c.slice/crio-f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0 WatchSource:0}: Error finding container f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0: Status 404 returned error can't find the container with id f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0 Mar 20 17:30:01 crc kubenswrapper[4795]: I0320 17:30:01.592999 4795 generic.go:334] "Generic (PLEG): container finished" podID="a6892589-ca9a-45cc-8991-ab0029e67e3c" containerID="c957ead85ece246848e605f8f78734d00ae750bd985db9f200ae787909bd1425" exitCode=0 Mar 20 17:30:01 crc kubenswrapper[4795]: I0320 17:30:01.593061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" event={"ID":"a6892589-ca9a-45cc-8991-ab0029e67e3c","Type":"ContainerDied","Data":"c957ead85ece246848e605f8f78734d00ae750bd985db9f200ae787909bd1425"} Mar 20 17:30:01 crc kubenswrapper[4795]: I0320 17:30:01.593491 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" event={"ID":"a6892589-ca9a-45cc-8991-ab0029e67e3c","Type":"ContainerStarted","Data":"f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0"} Mar 20 17:30:01 crc kubenswrapper[4795]: I0320 
17:30:01.596940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567130-kh5md" event={"ID":"f93986a1-82a8-4eac-ba5e-f790196b25ce","Type":"ContainerStarted","Data":"f2c40797c31a64b650cf210f5c7ba69205ad6d31cc00e971cfce59abdfa1f4eb"} Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.604317 4795 generic.go:334] "Generic (PLEG): container finished" podID="f93986a1-82a8-4eac-ba5e-f790196b25ce" containerID="56b4e175842a208b79b6d416a354b0c057585a391bd973a1b6ce26b23a0cd738" exitCode=0 Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.604422 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567130-kh5md" event={"ID":"f93986a1-82a8-4eac-ba5e-f790196b25ce","Type":"ContainerDied","Data":"56b4e175842a208b79b6d416a354b0c057585a391bd973a1b6ce26b23a0cd738"} Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.811549 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.955607 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhbvb\" (UniqueName: \"kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb\") pod \"a6892589-ca9a-45cc-8991-ab0029e67e3c\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.956030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume\") pod \"a6892589-ca9a-45cc-8991-ab0029e67e3c\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.956141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume\") pod \"a6892589-ca9a-45cc-8991-ab0029e67e3c\" (UID: \"a6892589-ca9a-45cc-8991-ab0029e67e3c\") " Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.956785 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume" (OuterVolumeSpecName: "config-volume") pod "a6892589-ca9a-45cc-8991-ab0029e67e3c" (UID: "a6892589-ca9a-45cc-8991-ab0029e67e3c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.961532 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb" (OuterVolumeSpecName: "kube-api-access-rhbvb") pod "a6892589-ca9a-45cc-8991-ab0029e67e3c" (UID: "a6892589-ca9a-45cc-8991-ab0029e67e3c"). InnerVolumeSpecName "kube-api-access-rhbvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:30:02 crc kubenswrapper[4795]: I0320 17:30:02.964944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a6892589-ca9a-45cc-8991-ab0029e67e3c" (UID: "a6892589-ca9a-45cc-8991-ab0029e67e3c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.057250 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a6892589-ca9a-45cc-8991-ab0029e67e3c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.057302 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6892589-ca9a-45cc-8991-ab0029e67e3c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.057328 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhbvb\" (UniqueName: \"kubernetes.io/projected/a6892589-ca9a-45cc-8991-ab0029e67e3c-kube-api-access-rhbvb\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.615621 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.615711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn" event={"ID":"a6892589-ca9a-45cc-8991-ab0029e67e3c","Type":"ContainerDied","Data":"f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0"} Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.615773 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f109c885b9a003d958942ee3edcf151467d8d69e3710e50c69aa7e0ac747f9a0" Mar 20 17:30:03 crc kubenswrapper[4795]: I0320 17:30:03.870978 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.070107 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx5ss\" (UniqueName: \"kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss\") pod \"f93986a1-82a8-4eac-ba5e-f790196b25ce\" (UID: \"f93986a1-82a8-4eac-ba5e-f790196b25ce\") " Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.075936 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss" (OuterVolumeSpecName: "kube-api-access-sx5ss") pod "f93986a1-82a8-4eac-ba5e-f790196b25ce" (UID: "f93986a1-82a8-4eac-ba5e-f790196b25ce"). InnerVolumeSpecName "kube-api-access-sx5ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.172174 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx5ss\" (UniqueName: \"kubernetes.io/projected/f93986a1-82a8-4eac-ba5e-f790196b25ce-kube-api-access-sx5ss\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.622984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567130-kh5md" event={"ID":"f93986a1-82a8-4eac-ba5e-f790196b25ce","Type":"ContainerDied","Data":"f2c40797c31a64b650cf210f5c7ba69205ad6d31cc00e971cfce59abdfa1f4eb"} Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.623043 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2c40797c31a64b650cf210f5c7ba69205ad6d31cc00e971cfce59abdfa1f4eb" Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.623070 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-kh5md" Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.950013 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-wjlwc"] Mar 20 17:30:04 crc kubenswrapper[4795]: I0320 17:30:04.956953 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-wjlwc"] Mar 20 17:30:05 crc kubenswrapper[4795]: I0320 17:30:05.265526 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c14395-0a4c-47be-8f64-382e60e3faad" path="/var/lib/kubelet/pods/35c14395-0a4c-47be-8f64-382e60e3faad/volumes" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.593475 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-smr2n"] Mar 20 17:30:37 crc kubenswrapper[4795]: E0320 17:30:37.596206 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93986a1-82a8-4eac-ba5e-f790196b25ce" containerName="oc" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.596419 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93986a1-82a8-4eac-ba5e-f790196b25ce" containerName="oc" Mar 20 17:30:37 crc kubenswrapper[4795]: E0320 17:30:37.596850 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6892589-ca9a-45cc-8991-ab0029e67e3c" containerName="collect-profiles" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.596966 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6892589-ca9a-45cc-8991-ab0029e67e3c" containerName="collect-profiles" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.597322 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6892589-ca9a-45cc-8991-ab0029e67e3c" containerName="collect-profiles" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.597552 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93986a1-82a8-4eac-ba5e-f790196b25ce" 
containerName="oc" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.598583 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.601403 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.601705 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-f2z74" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.601829 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.614744 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-smr2n"] Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.624787 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-lqmsr"] Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.625448 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lqmsr" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.627129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-c75zp" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.634215 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cff8c"] Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.640078 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.645353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-c7fr7" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.674117 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cff8c"] Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.690534 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lqmsr"] Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.772992 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmfwf\" (UniqueName: \"kubernetes.io/projected/88832f68-9f72-4321-8d3f-bb3e23465fdb-kube-api-access-mmfwf\") pod \"cert-manager-webhook-687f57d79b-cff8c\" (UID: \"88832f68-9f72-4321-8d3f-bb3e23465fdb\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.773066 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwwtf\" (UniqueName: \"kubernetes.io/projected/7df834a3-0298-4cc9-8b4e-49ce3f51183e-kube-api-access-pwwtf\") pod \"cert-manager-cainjector-cf98fcc89-smr2n\" (UID: \"7df834a3-0298-4cc9-8b4e-49ce3f51183e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.773090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58dzz\" (UniqueName: \"kubernetes.io/projected/5231a25a-8bda-4f72-8a81-e5a49cdc31eb-kube-api-access-58dzz\") pod \"cert-manager-858654f9db-lqmsr\" (UID: \"5231a25a-8bda-4f72-8a81-e5a49cdc31eb\") " pod="cert-manager/cert-manager-858654f9db-lqmsr" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.874332 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmfwf\" (UniqueName: \"kubernetes.io/projected/88832f68-9f72-4321-8d3f-bb3e23465fdb-kube-api-access-mmfwf\") pod \"cert-manager-webhook-687f57d79b-cff8c\" (UID: \"88832f68-9f72-4321-8d3f-bb3e23465fdb\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.874506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwwtf\" (UniqueName: \"kubernetes.io/projected/7df834a3-0298-4cc9-8b4e-49ce3f51183e-kube-api-access-pwwtf\") pod \"cert-manager-cainjector-cf98fcc89-smr2n\" (UID: \"7df834a3-0298-4cc9-8b4e-49ce3f51183e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.874549 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58dzz\" (UniqueName: \"kubernetes.io/projected/5231a25a-8bda-4f72-8a81-e5a49cdc31eb-kube-api-access-58dzz\") pod \"cert-manager-858654f9db-lqmsr\" (UID: \"5231a25a-8bda-4f72-8a81-e5a49cdc31eb\") " pod="cert-manager/cert-manager-858654f9db-lqmsr" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.895516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58dzz\" (UniqueName: \"kubernetes.io/projected/5231a25a-8bda-4f72-8a81-e5a49cdc31eb-kube-api-access-58dzz\") pod \"cert-manager-858654f9db-lqmsr\" (UID: \"5231a25a-8bda-4f72-8a81-e5a49cdc31eb\") " pod="cert-manager/cert-manager-858654f9db-lqmsr" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.900707 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwwtf\" (UniqueName: \"kubernetes.io/projected/7df834a3-0298-4cc9-8b4e-49ce3f51183e-kube-api-access-pwwtf\") pod \"cert-manager-cainjector-cf98fcc89-smr2n\" (UID: \"7df834a3-0298-4cc9-8b4e-49ce3f51183e\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.911424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmfwf\" (UniqueName: \"kubernetes.io/projected/88832f68-9f72-4321-8d3f-bb3e23465fdb-kube-api-access-mmfwf\") pod \"cert-manager-webhook-687f57d79b-cff8c\" (UID: \"88832f68-9f72-4321-8d3f-bb3e23465fdb\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.923001 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.948509 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-lqmsr" Mar 20 17:30:37 crc kubenswrapper[4795]: I0320 17:30:37.963143 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:38 crc kubenswrapper[4795]: I0320 17:30:38.137241 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-smr2n"] Mar 20 17:30:38 crc kubenswrapper[4795]: I0320 17:30:38.157179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" event={"ID":"7df834a3-0298-4cc9-8b4e-49ce3f51183e","Type":"ContainerStarted","Data":"efd5b0b29a9788b0c0d1841898dab5499cecdede2ffed8cf44a49ca57874eabf"} Mar 20 17:30:38 crc kubenswrapper[4795]: I0320 17:30:38.392223 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-lqmsr"] Mar 20 17:30:38 crc kubenswrapper[4795]: W0320 17:30:38.395331 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5231a25a_8bda_4f72_8a81_e5a49cdc31eb.slice/crio-a4c2b9640381d9fa568d89513a5f9297f28795b992c4bdd4796e60290e5a3366 WatchSource:0}: Error finding container a4c2b9640381d9fa568d89513a5f9297f28795b992c4bdd4796e60290e5a3366: Status 404 returned error can't find the container with id a4c2b9640381d9fa568d89513a5f9297f28795b992c4bdd4796e60290e5a3366 Mar 20 17:30:38 crc kubenswrapper[4795]: I0320 17:30:38.395782 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cff8c"] Mar 20 17:30:38 crc kubenswrapper[4795]: W0320 17:30:38.398768 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88832f68_9f72_4321_8d3f_bb3e23465fdb.slice/crio-f1b1c3d62274195c3190df7984c1b8d6d7bed0164a55be030f7fa0f8dae143ff WatchSource:0}: Error finding container f1b1c3d62274195c3190df7984c1b8d6d7bed0164a55be030f7fa0f8dae143ff: Status 404 returned error can't find the container with id f1b1c3d62274195c3190df7984c1b8d6d7bed0164a55be030f7fa0f8dae143ff Mar 20 17:30:39 crc kubenswrapper[4795]: I0320 17:30:39.164786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lqmsr" event={"ID":"5231a25a-8bda-4f72-8a81-e5a49cdc31eb","Type":"ContainerStarted","Data":"a4c2b9640381d9fa568d89513a5f9297f28795b992c4bdd4796e60290e5a3366"} Mar 20 17:30:39 crc kubenswrapper[4795]: I0320 17:30:39.165841 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" event={"ID":"88832f68-9f72-4321-8d3f-bb3e23465fdb","Type":"ContainerStarted","Data":"f1b1c3d62274195c3190df7984c1b8d6d7bed0164a55be030f7fa0f8dae143ff"} Mar 20 17:30:43 crc kubenswrapper[4795]: I0320 17:30:43.469974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" 
event={"ID":"7df834a3-0298-4cc9-8b4e-49ce3f51183e","Type":"ContainerStarted","Data":"5f77e9227173777a2cb8dcfa07abdecd019fd9d7c0e54e1e1c537afb1f54b789"} Mar 20 17:30:43 crc kubenswrapper[4795]: I0320 17:30:43.501835 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-smr2n" podStartSLOduration=4.124181095 podStartE2EDuration="6.501813893s" podCreationTimestamp="2026-03-20 17:30:37 +0000 UTC" firstStartedPulling="2026-03-20 17:30:38.145805883 +0000 UTC m=+781.603837424" lastFinishedPulling="2026-03-20 17:30:40.523438661 +0000 UTC m=+783.981470222" observedRunningTime="2026-03-20 17:30:43.487887213 +0000 UTC m=+786.945918754" watchObservedRunningTime="2026-03-20 17:30:43.501813893 +0000 UTC m=+786.959845434" Mar 20 17:30:44 crc kubenswrapper[4795]: I0320 17:30:44.479907 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-lqmsr" event={"ID":"5231a25a-8bda-4f72-8a81-e5a49cdc31eb","Type":"ContainerStarted","Data":"4e8a02576ca9d01c5c5222d196331e0bb56a92554c3c73fdb749b55304af5769"} Mar 20 17:30:44 crc kubenswrapper[4795]: I0320 17:30:44.482200 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" event={"ID":"88832f68-9f72-4321-8d3f-bb3e23465fdb","Type":"ContainerStarted","Data":"a3cbde9ffa21c0111c085397b230b40fd95ebae1d34a07cab60bd797651e9143"} Mar 20 17:30:44 crc kubenswrapper[4795]: I0320 17:30:44.482737 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" Mar 20 17:30:44 crc kubenswrapper[4795]: I0320 17:30:44.527071 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-lqmsr" podStartSLOduration=2.279664895 podStartE2EDuration="7.527054003s" podCreationTimestamp="2026-03-20 17:30:37 +0000 UTC" firstStartedPulling="2026-03-20 17:30:38.398355354 +0000 UTC 
m=+781.856386895" lastFinishedPulling="2026-03-20 17:30:43.645744442 +0000 UTC m=+787.103776003" observedRunningTime="2026-03-20 17:30:44.50007845 +0000 UTC m=+787.958110051" watchObservedRunningTime="2026-03-20 17:30:44.527054003 +0000 UTC m=+787.985085544" Mar 20 17:30:44 crc kubenswrapper[4795]: I0320 17:30:44.529165 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c" podStartSLOduration=2.216598082 podStartE2EDuration="7.529159319s" podCreationTimestamp="2026-03-20 17:30:37 +0000 UTC" firstStartedPulling="2026-03-20 17:30:38.400605925 +0000 UTC m=+781.858637466" lastFinishedPulling="2026-03-20 17:30:43.713167162 +0000 UTC m=+787.171198703" observedRunningTime="2026-03-20 17:30:44.525763712 +0000 UTC m=+787.983795273" watchObservedRunningTime="2026-03-20 17:30:44.529159319 +0000 UTC m=+787.987190860" Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.734380 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-krk7q"] Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.736483 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-controller" containerID="cri-o://c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.736954 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="northd" containerID="cri-o://ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.737001 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" 
podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-node" containerID="cri-o://6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.737087 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-acl-logging" containerID="cri-o://f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.736974 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.737173 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="sbdb" containerID="cri-o://2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.737238 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="nbdb" containerID="cri-o://bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" gracePeriod=30 Mar 20 17:30:47 crc kubenswrapper[4795]: I0320 17:30:47.830280 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" 
containerID="cri-o://c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" gracePeriod=30 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.115829 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/3.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.121652 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovn-acl-logging/0.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.122750 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovn-controller/0.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.123494 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.203869 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-srjsg"] Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204275 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204312 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204339 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204357 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc 
kubenswrapper[4795]: E0320 17:30:48.204380 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204399 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204429 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-node" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204448 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-node" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204477 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204495 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204524 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="sbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204545 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="sbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204576 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="northd" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204593 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="northd" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 
17:30:48.204622 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-acl-logging" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204640 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-acl-logging" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204664 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kubecfg-setup" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204680 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kubecfg-setup" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204743 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204761 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204783 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="nbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204800 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="nbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.204827 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.204846 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205068 
4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="sbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205102 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205125 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="northd" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205150 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205172 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205197 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="kube-rbac-proxy-node" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205215 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="nbdb" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205237 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovn-acl-logging" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205252 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205266 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc 
kubenswrapper[4795]: I0320 17:30:48.205283 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.205476 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205492 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.205660 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerName="ovnkube-controller" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.209794 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261747 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261827 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: 
\"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261903 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vrl5\" (UniqueName: \"kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261957 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261980 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.261998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262019 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262091 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262113 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262133 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262176 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 
17:30:48.262194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262211 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch\") pod \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\" (UID: \"520bb74b-cfa2-4f21-b561-989b0a3d6adc\") " Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262110 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262422 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262432 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262466 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262103 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). 
InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262147 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262164 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log" (OuterVolumeSpecName: "node-log") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262369 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262421 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket" (OuterVolumeSpecName: "log-socket") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262440 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash" (OuterVolumeSpecName: "host-slash") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262441 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262504 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.262935 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.263101 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.271527 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.271796 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5" (OuterVolumeSpecName: "kube-api-access-4vrl5") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "kube-api-access-4vrl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.277031 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "520bb74b-cfa2-4f21-b561-989b0a3d6adc" (UID: "520bb74b-cfa2-4f21-b561-989b0a3d6adc"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363198 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-systemd-units\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-etc-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-log-socket\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363396 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-bin\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363514 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-node-log\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363608 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-systemd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-var-lib-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-script-lib\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363796 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-slash\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363830 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-netns\") pod \"ovnkube-node-srjsg\" (UID: 
\"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-ovn\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxpdl\" (UniqueName: \"kubernetes.io/projected/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-kube-api-access-zxpdl\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363935 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.363967 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-env-overrides\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovn-node-metrics-cert\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364173 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364220 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-netd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364256 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-kubelet\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-config\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364618 4795 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364644 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364728 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364751 4795 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364771 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vrl5\" (UniqueName: \"kubernetes.io/projected/520bb74b-cfa2-4f21-b561-989b0a3d6adc-kube-api-access-4vrl5\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364790 4795 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364809 4795 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364825 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/520bb74b-cfa2-4f21-b561-989b0a3d6adc-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364843 4795 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364859 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364877 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/520bb74b-cfa2-4f21-b561-989b0a3d6adc-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364895 4795 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364914 4795 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.364931 4795 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.365035 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.365068 4795 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.365125 4795 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.365144 4795 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.365166 4795 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/520bb74b-cfa2-4f21-b561-989b0a3d6adc-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.465977 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466062 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-netd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466101 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-kubelet\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-config\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466178 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466223 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-systemd-units\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-etc-openvswitch\") pod 
\"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466323 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-netd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-log-socket\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-systemd-units\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466397 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-bin\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-node-log\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" 
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-systemd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466504 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-var-lib-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-script-lib\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-slash\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466609 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-netns\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466637 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-ovn\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxpdl\" (UniqueName: \"kubernetes.io/projected/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-kube-api-access-zxpdl\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466796 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-env-overrides\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466929 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-var-lib-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466963 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-ovn\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-etc-openvswitch\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-kubelet\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467113 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-log-socket\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-cni-bin\") pod 
\"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467198 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-node-log\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467240 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-run-systemd\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467287 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-slash\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-run-netns\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467477 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.467828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-config\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.468041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-env-overrides\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.468153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovnkube-script-lib\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.466866 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.468744 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovn-node-metrics-cert\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.472485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-ovn-node-metrics-cert\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.487399 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxpdl\" (UniqueName: \"kubernetes.io/projected/4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5-kube-api-access-zxpdl\") pod \"ovnkube-node-srjsg\" (UID: \"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.539405 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.543211 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/2.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.544071 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/1.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.544145 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8c31a7c-6ccb-43e0-9c95-33b85204cc39" containerID="199d60669fc8f63b3b210d2fc85e721bcf838edabcdff0694939a52f882125e7" exitCode=2 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.544233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" 
event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerDied","Data":"199d60669fc8f63b3b210d2fc85e721bcf838edabcdff0694939a52f882125e7"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.544282 4795 scope.go:117] "RemoveContainer" containerID="c1e8c865438f7361dabdc8244c838dc33061c92200d0436ac30d05eda71a6e86" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.544999 4795 scope.go:117] "RemoveContainer" containerID="199d60669fc8f63b3b210d2fc85e721bcf838edabcdff0694939a52f882125e7" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.545321 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xxwb6_openshift-multus(c8c31a7c-6ccb-43e0-9c95-33b85204cc39)\"" pod="openshift-multus/multus-xxwb6" podUID="c8c31a7c-6ccb-43e0-9c95-33b85204cc39" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.549530 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovnkube-controller/3.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.555136 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovn-acl-logging/0.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.556418 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krk7q_520bb74b-cfa2-4f21-b561-989b0a3d6adc/ovn-controller/0.log" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557768 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557813 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557829 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557842 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557856 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557869 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" exitCode=0 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557889 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc" exitCode=143 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557904 4795 generic.go:334] "Generic (PLEG): container finished" podID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892" exitCode=143 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 
17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557951 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.557978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558021 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558079 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558095 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558107 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558119 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558130 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558142 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558153 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558164 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558175 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558187 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558205 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558221 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558235 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558245 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558255 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558266 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558276 4795 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558286 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558296 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558306 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558316 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558345 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558357 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 
17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558368 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558379 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558390 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558400 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558411 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558422 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558432 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558442 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} Mar 20 
17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krk7q" event={"ID":"520bb74b-cfa2-4f21-b561-989b0a3d6adc","Type":"ContainerDied","Data":"1a7cad6fc70f9635016cf59ae47845a4cfbc41683f6ddf222d2b7bd36fabfbfb"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558471 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558482 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558493 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558505 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558515 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558527 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558537 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558549 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558559 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.558569 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} Mar 20 17:30:48 crc kubenswrapper[4795]: W0320 17:30:48.569951 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b3d8c82_869d_4762_8d8b_56d9d2d2c9e5.slice/crio-7e9ffc5e4fc4ded2c601af31f1f9dd7ad033f54b7515e4d6fe36abdcdc466365 WatchSource:0}: Error finding container 7e9ffc5e4fc4ded2c601af31f1f9dd7ad033f54b7515e4d6fe36abdcdc466365: Status 404 returned error can't find the container with id 7e9ffc5e4fc4ded2c601af31f1f9dd7ad033f54b7515e4d6fe36abdcdc466365 Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.593375 4795 scope.go:117] "RemoveContainer" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.630339 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-krk7q"] Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.632070 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-krk7q"] Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.638452 4795 scope.go:117] 
"RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.662813 4795 scope.go:117] "RemoveContainer" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.688629 4795 scope.go:117] "RemoveContainer" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.710961 4795 scope.go:117] "RemoveContainer" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.739521 4795 scope.go:117] "RemoveContainer" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.757484 4795 scope.go:117] "RemoveContainer" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.831826 4795 scope.go:117] "RemoveContainer" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.848485 4795 scope.go:117] "RemoveContainer" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.867142 4795 scope.go:117] "RemoveContainer" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.913507 4795 scope.go:117] "RemoveContainer" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.914045 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": container with ID starting with 
c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747 not found: ID does not exist" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914110 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} err="failed to get container status \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": rpc error: code = NotFound desc = could not find container \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": container with ID starting with c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914154 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.914549 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": container with ID starting with d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e not found: ID does not exist" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914576 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} err="failed to get container status \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": rpc error: code = NotFound desc = could not find container \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": container with ID starting with d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e not found: ID does not 
exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914590 4795 scope.go:117] "RemoveContainer" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.914857 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": container with ID starting with 2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859 not found: ID does not exist" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914878 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} err="failed to get container status \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": rpc error: code = NotFound desc = could not find container \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": container with ID starting with 2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.914891 4795 scope.go:117] "RemoveContainer" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.915085 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": container with ID starting with bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d not found: ID does not exist" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915146 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} err="failed to get container status \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": rpc error: code = NotFound desc = could not find container \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": container with ID starting with bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915162 4795 scope.go:117] "RemoveContainer" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.915411 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": container with ID starting with ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6 not found: ID does not exist" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915427 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} err="failed to get container status \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": rpc error: code = NotFound desc = could not find container \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": container with ID starting with ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6 not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915440 4795 scope.go:117] "RemoveContainer" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.915644 4795 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": container with ID starting with 9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc not found: ID does not exist" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915662 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} err="failed to get container status \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": rpc error: code = NotFound desc = could not find container \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": container with ID starting with 9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc not found: ID does not exist" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915677 4795 scope.go:117] "RemoveContainer" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.915876 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": container with ID starting with 6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c not found: ID does not exist" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c" Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915905 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} err="failed to get container status \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": rpc error: code = NotFound desc = could 
not find container \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": container with ID starting with 6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.915919 4795 scope.go:117] "RemoveContainer" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"
Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.916105 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": container with ID starting with f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc not found: ID does not exist" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916132 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} err="failed to get container status \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": rpc error: code = NotFound desc = could not find container \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": container with ID starting with f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916147 4795 scope.go:117] "RemoveContainer" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"
Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.916460 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": container with ID starting with c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892 not found: ID does not exist" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916527 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} err="failed to get container status \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": rpc error: code = NotFound desc = could not find container \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": container with ID starting with c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916569 4795 scope.go:117] "RemoveContainer" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"
Mar 20 17:30:48 crc kubenswrapper[4795]: E0320 17:30:48.916901 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": container with ID starting with a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169 not found: ID does not exist" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916929 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} err="failed to get container status \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": rpc error: code = NotFound desc = could not find container \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": container with ID starting with a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.916943 4795 scope.go:117] "RemoveContainer" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917155 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} err="failed to get container status \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": rpc error: code = NotFound desc = could not find container \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": container with ID starting with c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917176 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917415 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} err="failed to get container status \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": rpc error: code = NotFound desc = could not find container \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": container with ID starting with d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917442 4795 scope.go:117] "RemoveContainer" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917738 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} err="failed to get container status \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": rpc error: code = NotFound desc = could not find container \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": container with ID starting with 2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.917814 4795 scope.go:117] "RemoveContainer" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918114 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} err="failed to get container status \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": rpc error: code = NotFound desc = could not find container \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": container with ID starting with bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918149 4795 scope.go:117] "RemoveContainer" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918433 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} err="failed to get container status \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": rpc error: code = NotFound desc = could not find container \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": container with ID starting with ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918459 4795 scope.go:117] "RemoveContainer" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918637 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} err="failed to get container status \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": rpc error: code = NotFound desc = could not find container \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": container with ID starting with 9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918660 4795 scope.go:117] "RemoveContainer" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918904 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} err="failed to get container status \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": rpc error: code = NotFound desc = could not find container \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": container with ID starting with 6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.918934 4795 scope.go:117] "RemoveContainer" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919138 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} err="failed to get container status \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": rpc error: code = NotFound desc = could not find container \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": container with ID starting with f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919161 4795 scope.go:117] "RemoveContainer" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919330 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} err="failed to get container status \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": rpc error: code = NotFound desc = could not find container \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": container with ID starting with c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919352 4795 scope.go:117] "RemoveContainer" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919508 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} err="failed to get container status \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": rpc error: code = NotFound desc = could not find container \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": container with ID starting with a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919527 4795 scope.go:117] "RemoveContainer" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919707 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} err="failed to get container status \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": rpc error: code = NotFound desc = could not find container \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": container with ID starting with c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919724 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919891 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} err="failed to get container status \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": rpc error: code = NotFound desc = could not find container \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": container with ID starting with d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.919910 4795 scope.go:117] "RemoveContainer" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920068 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} err="failed to get container status \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": rpc error: code = NotFound desc = could not find container \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": container with ID starting with 2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920101 4795 scope.go:117] "RemoveContainer" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920252 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} err="failed to get container status \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": rpc error: code = NotFound desc = could not find container \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": container with ID starting with bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920270 4795 scope.go:117] "RemoveContainer" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920444 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} err="failed to get container status \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": rpc error: code = NotFound desc = could not find container \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": container with ID starting with ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920460 4795 scope.go:117] "RemoveContainer" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920621 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} err="failed to get container status \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": rpc error: code = NotFound desc = could not find container \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": container with ID starting with 9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920638 4795 scope.go:117] "RemoveContainer" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920808 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} err="failed to get container status \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": rpc error: code = NotFound desc = could not find container \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": container with ID starting with 6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.920827 4795 scope.go:117] "RemoveContainer" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921007 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} err="failed to get container status \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": rpc error: code = NotFound desc = could not find container \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": container with ID starting with f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921030 4795 scope.go:117] "RemoveContainer" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921216 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} err="failed to get container status \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": rpc error: code = NotFound desc = could not find container \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": container with ID starting with c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921233 4795 scope.go:117] "RemoveContainer" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921405 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} err="failed to get container status \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": rpc error: code = NotFound desc = could not find container \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": container with ID starting with a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921430 4795 scope.go:117] "RemoveContainer" containerID="c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921586 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747"} err="failed to get container status \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": rpc error: code = NotFound desc = could not find container \"c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747\": container with ID starting with c6861022e89353d8607ddcda6c289adba4bda30bd52a3c49ac0f8fe1e1d4c747 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921604 4795 scope.go:117] "RemoveContainer" containerID="d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921786 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e"} err="failed to get container status \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": rpc error: code = NotFound desc = could not find container \"d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e\": container with ID starting with d4c94de7d795034750d6a01b28158c79ad370fffd8627b92553e87524c55829e not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921800 4795 scope.go:117] "RemoveContainer" containerID="2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921978 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859"} err="failed to get container status \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": rpc error: code = NotFound desc = could not find container \"2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859\": container with ID starting with 2eb2c6c26fc4198b111519b536d627518b3f31fea222851da60e88117a44f859 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.921994 4795 scope.go:117] "RemoveContainer" containerID="bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922145 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d"} err="failed to get container status \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": rpc error: code = NotFound desc = could not find container \"bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d\": container with ID starting with bfa55af4351cd1bd7b23258b960bb1f21736c3feaefffc9e6e245f56271a770d not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922158 4795 scope.go:117] "RemoveContainer" containerID="ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922349 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6"} err="failed to get container status \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": rpc error: code = NotFound desc = could not find container \"ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6\": container with ID starting with ec36afb2e7960d45b53a039357b76805f8c693c637ad4128f18653569ab7f6b6 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922362 4795 scope.go:117] "RemoveContainer" containerID="9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922526 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc"} err="failed to get container status \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": rpc error: code = NotFound desc = could not find container \"9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc\": container with ID starting with 9444b1c5dc2126d27787c403d23c1ea280f55e216d19305e0c9171bbbf7a8dbc not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922548 4795 scope.go:117] "RemoveContainer" containerID="6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922730 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c"} err="failed to get container status \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": rpc error: code = NotFound desc = could not find container \"6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c\": container with ID starting with 6335650f2b5783ae4a947e4030f458e16abea66f366d8afa080b14325f5a1d6c not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.922747 4795 scope.go:117] "RemoveContainer" containerID="f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.923018 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc"} err="failed to get container status \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": rpc error: code = NotFound desc = could not find container \"f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc\": container with ID starting with f293d3376706e35084773109d2a3fa6152c3dbdd3a8d034fdccfd69a83ad90fc not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.923042 4795 scope.go:117] "RemoveContainer" containerID="c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.923431 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892"} err="failed to get container status \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": rpc error: code = NotFound desc = could not find container \"c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892\": container with ID starting with c406148ca88b761ed06a6cede9a6aba2b641548671ffd4c6f4c4db5c02f56892 not found: ID does not exist"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.923461 4795 scope.go:117] "RemoveContainer" containerID="a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"
Mar 20 17:30:48 crc kubenswrapper[4795]: I0320 17:30:48.923813 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169"} err="failed to get container status \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": rpc error: code = NotFound desc = could not find container \"a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169\": container with ID starting with a4ea823b2752e1908bd37a1460195c6a198aafa3121c3bcba32eae91bde19169 not found: ID does not exist"
Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.265272 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="520bb74b-cfa2-4f21-b561-989b0a3d6adc" path="/var/lib/kubelet/pods/520bb74b-cfa2-4f21-b561-989b0a3d6adc/volumes"
Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.288784 4795 scope.go:117] "RemoveContainer" containerID="e208a8a62ce5332bce059cfe9498a63b10989e2ede473bf8237789de0f3da7f0"
Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.566844 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/2.log"
Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.568830 4795 generic.go:334] "Generic (PLEG): container finished" podID="4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5" containerID="005e00657a18f9011e09c890eb2968bb56738f35496a3e1f6ab829d77d35eee1" exitCode=0
Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.568899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerDied","Data":"005e00657a18f9011e09c890eb2968bb56738f35496a3e1f6ab829d77d35eee1"}
Mar 20 17:30:49 crc kubenswrapper[4795]: I0320 17:30:49.568948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"7e9ffc5e4fc4ded2c601af31f1f9dd7ad033f54b7515e4d6fe36abdcdc466365"}
Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"e59bf0716bd255cd688ae57e9f097ab38b7b6f13866739bd382abf1678e530ee"}
Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"37577f9e1c030e30cba8d0dd22808abddd25b232d48bd61cb38ce696ab3a22f2"}
Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579888 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"1ce5bcd753e77cb30874142addbe046b6b8282c35867894de9e349ff63feac8f"}
Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579900 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"d4c6d1d1d49f3494c84d195c8c4d887bc335bb03491acece41d188edc99c984c"}
Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"35eaf00d26807dcd790c0828962a838138c457f4e57cfc0916c67c9e0c56b252"}
Mar 20 17:30:50 crc kubenswrapper[4795]: I0320 17:30:50.579922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"c3b0a5cd96aa4486c545a0503e67b092787fa04696de6cb65fc17979ce116958"}
Mar 20 17:30:52 crc kubenswrapper[4795]: I0320 17:30:52.967488 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-cff8c"
Mar 20 17:30:53 crc kubenswrapper[4795]: I0320 17:30:53.605339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"5c3504372687ee1d8851e55ffda227591f5e52a7066c585c151dae88a51adaef"}
Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.623231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" event={"ID":"4b3d8c82-869d-4762-8d8b-56d9d2d2c9e5","Type":"ContainerStarted","Data":"1063cc41fdd40a7916837151f74b6dfe5ea45cbc82aea21b74b7497907b819fb"}
Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.623899 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg"
Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.623916 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg"
Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.623929 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg"
Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.657011 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg"
Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.657775 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg"
Mar 20 17:30:55 crc kubenswrapper[4795]: I0320 17:30:55.676361 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg" podStartSLOduration=7.676336363 podStartE2EDuration="7.676336363s" podCreationTimestamp="2026-03-20 17:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:30:55.670912282 +0000 UTC m=+799.128943833" watchObservedRunningTime="2026-03-20 17:30:55.676336363 +0000 UTC m=+799.134367944"
Mar 20 17:31:00 crc kubenswrapper[4795]: I0320 17:31:00.252505 4795 scope.go:117] "RemoveContainer" containerID="199d60669fc8f63b3b210d2fc85e721bcf838edabcdff0694939a52f882125e7"
Mar 20 17:31:00 crc kubenswrapper[4795]: I0320 17:31:00.732821 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xxwb6_c8c31a7c-6ccb-43e0-9c95-33b85204cc39/kube-multus/2.log"
Mar 20 17:31:00 crc kubenswrapper[4795]: I0320 17:31:00.733327 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xxwb6" event={"ID":"c8c31a7c-6ccb-43e0-9c95-33b85204cc39","Type":"ContainerStarted","Data":"940c532728c5764b45b49b713e9e2b429773e60bb316ae666156a65d826c3a77"}
Mar 20 17:31:11 crc kubenswrapper[4795]: I0320 17:31:11.300445 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:31:11 crc kubenswrapper[4795]: I0320 17:31:11.300902 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:31:18 crc kubenswrapper[4795]: I0320 17:31:18.576993 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-srjsg"
Mar 20 17:31:27 crc kubenswrapper[4795]: I0320 17:31:27.764273 4795 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.442337 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"]
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.444770 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.448678 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.459713 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"]
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.589585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rstv\" (UniqueName: \"kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.589763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.589833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.691830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.691976 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rstv\" (UniqueName: \"kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.692050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.693129 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"
Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.695355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName:
\"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.712560 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rstv\" (UniqueName: \"kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:40 crc kubenswrapper[4795]: I0320 17:31:40.770093 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 17:31:41.023543 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h"] Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 17:31:41.300639 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 17:31:41.301093 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 
17:31:41.878866 4795 generic.go:334] "Generic (PLEG): container finished" podID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerID="85d39cd3b7573adaa8cd33998666d4c7e79dd991bc6ef15c2cea7285efac8969" exitCode=0 Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 17:31:41.878973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" event={"ID":"cf16b9b7-bdbf-48db-a358-3c32c93b3d43","Type":"ContainerDied","Data":"85d39cd3b7573adaa8cd33998666d4c7e79dd991bc6ef15c2cea7285efac8969"} Mar 20 17:31:41 crc kubenswrapper[4795]: I0320 17:31:41.880114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" event={"ID":"cf16b9b7-bdbf-48db-a358-3c32c93b3d43","Type":"ContainerStarted","Data":"98d214b944aebd78a2b668652156b847f1c5ec8fc4b459e0c772615f90095c54"} Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.754551 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"] Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.758462 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.766894 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"] Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.928524 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.928791 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtnk4\" (UniqueName: \"kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:42 crc kubenswrapper[4795]: I0320 17:31:42.928847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.030371 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.030455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.030480 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtnk4\" (UniqueName: \"kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.030984 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.031086 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.069095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtnk4\" (UniqueName: \"kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4\") pod \"redhat-operators-ccgqp\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.094680 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.308865 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"] Mar 20 17:31:43 crc kubenswrapper[4795]: W0320 17:31:43.322142 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod596dd5ef_f287_4f26_9618_c7763a911124.slice/crio-6ecffaee968618ba46837c7d9197230f37328559d4478e137afa21294923f290 WatchSource:0}: Error finding container 6ecffaee968618ba46837c7d9197230f37328559d4478e137afa21294923f290: Status 404 returned error can't find the container with id 6ecffaee968618ba46837c7d9197230f37328559d4478e137afa21294923f290 Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.890663 4795 generic.go:334] "Generic (PLEG): container finished" podID="596dd5ef-f287-4f26-9618-c7763a911124" containerID="b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d" exitCode=0 Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.890749 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerDied","Data":"b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d"} Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.890778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerStarted","Data":"6ecffaee968618ba46837c7d9197230f37328559d4478e137afa21294923f290"} Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.893533 4795 generic.go:334] "Generic (PLEG): container finished" podID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerID="dd3c89461c58ed60219166c8784c9924960bbd8b2c8a5bee74fe1482bf5922a7" exitCode=0 Mar 20 17:31:43 crc kubenswrapper[4795]: I0320 17:31:43.893563 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" event={"ID":"cf16b9b7-bdbf-48db-a358-3c32c93b3d43","Type":"ContainerDied","Data":"dd3c89461c58ed60219166c8784c9924960bbd8b2c8a5bee74fe1482bf5922a7"} Mar 20 17:31:44 crc kubenswrapper[4795]: I0320 17:31:44.905278 4795 generic.go:334] "Generic (PLEG): container finished" podID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerID="863e88b27c645f7b01ba4a1b4c66d16ca94747da1b2a0288c2d28921ca533697" exitCode=0 Mar 20 17:31:44 crc kubenswrapper[4795]: I0320 17:31:44.905484 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" event={"ID":"cf16b9b7-bdbf-48db-a358-3c32c93b3d43","Type":"ContainerDied","Data":"863e88b27c645f7b01ba4a1b4c66d16ca94747da1b2a0288c2d28921ca533697"} Mar 20 17:31:44 crc kubenswrapper[4795]: I0320 17:31:44.909325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerStarted","Data":"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5"} Mar 20 17:31:45 crc kubenswrapper[4795]: I0320 17:31:45.920197 4795 generic.go:334] "Generic (PLEG): container finished" podID="596dd5ef-f287-4f26-9618-c7763a911124" containerID="fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5" exitCode=0 Mar 20 17:31:45 crc kubenswrapper[4795]: I0320 17:31:45.920581 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerDied","Data":"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5"} Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.307997 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.378013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rstv\" (UniqueName: \"kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv\") pod \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.378353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util\") pod \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.378417 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle\") pod \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\" (UID: \"cf16b9b7-bdbf-48db-a358-3c32c93b3d43\") " Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.379471 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle" (OuterVolumeSpecName: "bundle") pod "cf16b9b7-bdbf-48db-a358-3c32c93b3d43" (UID: "cf16b9b7-bdbf-48db-a358-3c32c93b3d43"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.381303 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.387345 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv" (OuterVolumeSpecName: "kube-api-access-7rstv") pod "cf16b9b7-bdbf-48db-a358-3c32c93b3d43" (UID: "cf16b9b7-bdbf-48db-a358-3c32c93b3d43"). InnerVolumeSpecName "kube-api-access-7rstv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.484966 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rstv\" (UniqueName: \"kubernetes.io/projected/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-kube-api-access-7rstv\") on node \"crc\" DevicePath \"\"" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.755832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util" (OuterVolumeSpecName: "util") pod "cf16b9b7-bdbf-48db-a358-3c32c93b3d43" (UID: "cf16b9b7-bdbf-48db-a358-3c32c93b3d43"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.788722 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf16b9b7-bdbf-48db-a358-3c32c93b3d43-util\") on node \"crc\" DevicePath \"\"" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.928342 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerStarted","Data":"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42"} Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.930953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" event={"ID":"cf16b9b7-bdbf-48db-a358-3c32c93b3d43","Type":"ContainerDied","Data":"98d214b944aebd78a2b668652156b847f1c5ec8fc4b459e0c772615f90095c54"} Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.930971 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d214b944aebd78a2b668652156b847f1c5ec8fc4b459e0c772615f90095c54" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.931010 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h" Mar 20 17:31:46 crc kubenswrapper[4795]: I0320 17:31:46.968966 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ccgqp" podStartSLOduration=2.422073167 podStartE2EDuration="4.968947286s" podCreationTimestamp="2026-03-20 17:31:42 +0000 UTC" firstStartedPulling="2026-03-20 17:31:43.892478242 +0000 UTC m=+847.350509783" lastFinishedPulling="2026-03-20 17:31:46.439352331 +0000 UTC m=+849.897383902" observedRunningTime="2026-03-20 17:31:46.96499087 +0000 UTC m=+850.423022411" watchObservedRunningTime="2026-03-20 17:31:46.968947286 +0000 UTC m=+850.426978837" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.067308 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dlcps"] Mar 20 17:31:51 crc kubenswrapper[4795]: E0320 17:31:51.068094 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="extract" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.068110 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="extract" Mar 20 17:31:51 crc kubenswrapper[4795]: E0320 17:31:51.068122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="util" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.068129 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="util" Mar 20 17:31:51 crc kubenswrapper[4795]: E0320 17:31:51.068142 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="pull" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.068149 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="pull" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.068285 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf16b9b7-bdbf-48db-a358-3c32c93b3d43" containerName="extract" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.068708 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.072127 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.075857 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.076148 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-jpd2k" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.077553 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dlcps"] Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.250764 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqsh\" (UniqueName: \"kubernetes.io/projected/efca4120-31ef-4c52-a6da-59b33144a979-kube-api-access-mjqsh\") pod \"nmstate-operator-796d4cfff4-dlcps\" (UID: \"efca4120-31ef-4c52-a6da-59b33144a979\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.352678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqsh\" (UniqueName: \"kubernetes.io/projected/efca4120-31ef-4c52-a6da-59b33144a979-kube-api-access-mjqsh\") pod \"nmstate-operator-796d4cfff4-dlcps\" (UID: \"efca4120-31ef-4c52-a6da-59b33144a979\") " 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.381312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqsh\" (UniqueName: \"kubernetes.io/projected/efca4120-31ef-4c52-a6da-59b33144a979-kube-api-access-mjqsh\") pod \"nmstate-operator-796d4cfff4-dlcps\" (UID: \"efca4120-31ef-4c52-a6da-59b33144a979\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.423899 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.646305 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-dlcps"] Mar 20 17:31:51 crc kubenswrapper[4795]: I0320 17:31:51.964697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" event={"ID":"efca4120-31ef-4c52-a6da-59b33144a979","Type":"ContainerStarted","Data":"6c4df5b572666df4d95c2f029bddd98f0cc74c84c6548cc3e6e40e6f5945b36b"} Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.095151 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.095418 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.742205 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"] Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.743823 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.766181 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"] Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.884670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.884739 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5sr\" (UniqueName: \"kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.884967 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.986596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.986705 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.986728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5sr\" (UniqueName: \"kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.987168 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:53 crc kubenswrapper[4795]: I0320 17:31:53.987358 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:54 crc kubenswrapper[4795]: I0320 17:31:54.016795 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5sr\" (UniqueName: \"kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr\") pod \"certified-operators-4gbg2\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:31:54 crc kubenswrapper[4795]: I0320 17:31:54.076285 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gbg2"
Mar 20 17:31:54 crc kubenswrapper[4795]: I0320 17:31:54.145412 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ccgqp" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="registry-server" probeResult="failure" output=<
Mar 20 17:31:54 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Mar 20 17:31:54 crc kubenswrapper[4795]: >
Mar 20 17:31:54 crc kubenswrapper[4795]: I0320 17:31:54.542552 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"]
Mar 20 17:31:55 crc kubenswrapper[4795]: W0320 17:31:55.330065 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6158d7fa_4982_4817_9c29_b1f3c3fd70d7.slice/crio-306762ba2f3a916ee3f297df221a3a99b8f647575a6fbd57139b3caf80350c2e WatchSource:0}: Error finding container 306762ba2f3a916ee3f297df221a3a99b8f647575a6fbd57139b3caf80350c2e: Status 404 returned error can't find the container with id 306762ba2f3a916ee3f297df221a3a99b8f647575a6fbd57139b3caf80350c2e
Mar 20 17:31:55 crc kubenswrapper[4795]: I0320 17:31:55.992290 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerStarted","Data":"306762ba2f3a916ee3f297df221a3a99b8f647575a6fbd57139b3caf80350c2e"}
Mar 20 17:31:57 crc kubenswrapper[4795]: I0320 17:31:57.003197 4795 generic.go:334] "Generic (PLEG): container finished" podID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerID="341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4" exitCode=0
Mar 20 17:31:57 crc kubenswrapper[4795]: I0320 17:31:57.003286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerDied","Data":"341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4"}
Mar 20 17:31:58 crc kubenswrapper[4795]: I0320 17:31:58.012856 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" event={"ID":"efca4120-31ef-4c52-a6da-59b33144a979","Type":"ContainerStarted","Data":"e451b6c96025b3dc22e7abc27afcac5348c1901cd3239785abeeac594944b56a"}
Mar 20 17:31:58 crc kubenswrapper[4795]: I0320 17:31:58.043515 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-dlcps" podStartSLOduration=1.477015384 podStartE2EDuration="7.043494583s" podCreationTimestamp="2026-03-20 17:31:51 +0000 UTC" firstStartedPulling="2026-03-20 17:31:51.660648104 +0000 UTC m=+855.118679645" lastFinishedPulling="2026-03-20 17:31:57.227127293 +0000 UTC m=+860.685158844" observedRunningTime="2026-03-20 17:31:58.033027801 +0000 UTC m=+861.491059372" watchObservedRunningTime="2026-03-20 17:31:58.043494583 +0000 UTC m=+861.501526124"
Mar 20 17:31:59 crc kubenswrapper[4795]: I0320 17:31:59.023092 4795 generic.go:334] "Generic (PLEG): container finished" podID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerID="1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386" exitCode=0
Mar 20 17:31:59 crc kubenswrapper[4795]: I0320 17:31:59.023205 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerDied","Data":"1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386"}
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.035426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerStarted","Data":"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5"}
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.059299 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4gbg2" podStartSLOduration=4.870434044 podStartE2EDuration="7.059272179s" podCreationTimestamp="2026-03-20 17:31:53 +0000 UTC" firstStartedPulling="2026-03-20 17:31:57.218384726 +0000 UTC m=+860.676416287" lastFinishedPulling="2026-03-20 17:31:59.407222851 +0000 UTC m=+862.865254422" observedRunningTime="2026-03-20 17:32:00.057661338 +0000 UTC m=+863.515692899" watchObservedRunningTime="2026-03-20 17:32:00.059272179 +0000 UTC m=+863.517303750"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.146911 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567132-b9gh7"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.148033 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-b9gh7"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.151368 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.153103 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.153441 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.156884 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-b9gh7"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.273276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcvpj\" (UniqueName: \"kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj\") pod \"auto-csr-approver-29567132-b9gh7\" (UID: \"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd\") " pod="openshift-infra/auto-csr-approver-29567132-b9gh7"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.374292 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcvpj\" (UniqueName: \"kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj\") pod \"auto-csr-approver-29567132-b9gh7\" (UID: \"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd\") " pod="openshift-infra/auto-csr-approver-29567132-b9gh7"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.395427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcvpj\" (UniqueName: \"kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj\") pod \"auto-csr-approver-29567132-b9gh7\" (UID: \"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd\") " pod="openshift-infra/auto-csr-approver-29567132-b9gh7"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.471913 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-b9gh7"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.644471 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.645573 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.649846 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bftb5"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.664065 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.667723 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.668362 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.669932 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.674030 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bsp49"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.674959 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.685969 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqnfh\" (UniqueName: \"kubernetes.io/projected/f50011ef-d180-4d84-ba10-a2da522a579d-kube-api-access-pqnfh\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9fj\" (UniqueName: \"kubernetes.io/projected/e070281f-65f5-4c6d-b012-06c027393646-kube-api-access-ch9fj\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whpbn\" (UniqueName: \"kubernetes.io/projected/65c42497-77ba-49bc-a292-5003a353fde6-kube-api-access-whpbn\") pod \"nmstate-metrics-9b8c8685d-xjj2s\" (UID: \"65c42497-77ba-49bc-a292-5003a353fde6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f50011ef-d180-4d84-ba10-a2da522a579d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687365 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-ovs-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687384 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-nmstate-lock\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.687410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-dbus-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.729811 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-b9gh7"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.780672 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.781310 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.783365 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.783541 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.783665 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-79hq8"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788346 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f50011ef-d180-4d84-ba10-a2da522a579d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-ovs-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788416 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-nmstate-lock\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-dbus-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788480 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqnfh\" (UniqueName: \"kubernetes.io/projected/f50011ef-d180-4d84-ba10-a2da522a579d-kube-api-access-pqnfh\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788501 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch9fj\" (UniqueName: \"kubernetes.io/projected/e070281f-65f5-4c6d-b012-06c027393646-kube-api-access-ch9fj\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788528 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whpbn\" (UniqueName: \"kubernetes.io/projected/65c42497-77ba-49bc-a292-5003a353fde6-kube-api-access-whpbn\") pod \"nmstate-metrics-9b8c8685d-xjj2s\" (UID: \"65c42497-77ba-49bc-a292-5003a353fde6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.788665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-nmstate-lock\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.789047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-dbus-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.789390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e070281f-65f5-4c6d-b012-06c027393646-ovs-socket\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.790804 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.796997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f50011ef-d180-4d84-ba10-a2da522a579d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.816925 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch9fj\" (UniqueName: \"kubernetes.io/projected/e070281f-65f5-4c6d-b012-06c027393646-kube-api-access-ch9fj\") pod \"nmstate-handler-bsp49\" (UID: \"e070281f-65f5-4c6d-b012-06c027393646\") " pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.823572 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqnfh\" (UniqueName: \"kubernetes.io/projected/f50011ef-d180-4d84-ba10-a2da522a579d-kube-api-access-pqnfh\") pod \"nmstate-webhook-5f558f5558-mjhsq\" (UID: \"f50011ef-d180-4d84-ba10-a2da522a579d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.823675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whpbn\" (UniqueName: \"kubernetes.io/projected/65c42497-77ba-49bc-a292-5003a353fde6-kube-api-access-whpbn\") pod \"nmstate-metrics-9b8c8685d-xjj2s\" (UID: \"65c42497-77ba-49bc-a292-5003a353fde6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.889925 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69nj4\" (UniqueName: \"kubernetes.io/projected/d34761db-41bf-4e5f-bdca-8c25e281c924-kube-api-access-69nj4\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.890344 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d34761db-41bf-4e5f-bdca-8c25e281c924-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.890399 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d34761db-41bf-4e5f-bdca-8c25e281c924-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.970893 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c5444f47d-jcv9w"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.971521 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.981364 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5444f47d-jcv9w"]
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996319 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996754 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d34761db-41bf-4e5f-bdca-8c25e281c924-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-console-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-oauth-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69nj4\" (UniqueName: \"kubernetes.io/projected/d34761db-41bf-4e5f-bdca-8c25e281c924-kube-api-access-69nj4\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996933 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptnjm\" (UniqueName: \"kubernetes.io/projected/b27883d5-9c21-4869-b626-3fe39f007913-kube-api-access-ptnjm\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.996990 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-service-ca\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.997010 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-oauth-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.997043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d34761db-41bf-4e5f-bdca-8c25e281c924-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.997070 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-trusted-ca-bundle\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:00 crc kubenswrapper[4795]: I0320 17:32:00.997651 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d34761db-41bf-4e5f-bdca-8c25e281c924-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.005749 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d34761db-41bf-4e5f-bdca-8c25e281c924-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.013191 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.020941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69nj4\" (UniqueName: \"kubernetes.io/projected/d34761db-41bf-4e5f-bdca-8c25e281c924-kube-api-access-69nj4\") pod \"nmstate-console-plugin-86f58fcf4-5jfjl\" (UID: \"d34761db-41bf-4e5f-bdca-8c25e281c924\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.042031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" event={"ID":"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd","Type":"ContainerStarted","Data":"8d336676c584568330dc6906a42fe736bdf03c4eab298e774dfad7ffa35d7551"}
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.042871 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bsp49"
Mar 20 17:32:01 crc kubenswrapper[4795]: W0320 17:32:01.067217 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode070281f_65f5_4c6d_b012_06c027393646.slice/crio-dec3f3398a33fe5be4e6458f497ee4009f7aa21becff93bc1b7bbd2cfe2b13c6 WatchSource:0}: Error finding container dec3f3398a33fe5be4e6458f497ee4009f7aa21becff93bc1b7bbd2cfe2b13c6: Status 404 returned error can't find the container with id dec3f3398a33fe5be4e6458f497ee4009f7aa21becff93bc1b7bbd2cfe2b13c6
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.095564 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-console-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097615 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-oauth-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptnjm\" (UniqueName: \"kubernetes.io/projected/b27883d5-9c21-4869-b626-3fe39f007913-kube-api-access-ptnjm\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097693 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-service-ca\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097709 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-oauth-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.097743 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-trusted-ca-bundle\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.098577 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-console-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.098586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-trusted-ca-bundle\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.098630 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-service-ca\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.099131 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b27883d5-9c21-4869-b626-3fe39f007913-oauth-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.103098 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-oauth-config\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.103921 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b27883d5-9c21-4869-b626-3fe39f007913-console-serving-cert\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.116116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptnjm\" (UniqueName: \"kubernetes.io/projected/b27883d5-9c21-4869-b626-3fe39f007913-kube-api-access-ptnjm\") pod \"console-c5444f47d-jcv9w\" (UID: \"b27883d5-9c21-4869-b626-3fe39f007913\") " pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.231457 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s"]
Mar 20 17:32:01 crc kubenswrapper[4795]: W0320 17:32:01.241425 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65c42497_77ba_49bc_a292_5003a353fde6.slice/crio-996eb0156974055a7a10ec98e543cfa3f58538e492e34858aebe04a81d8ab93b WatchSource:0}: Error finding container 996eb0156974055a7a10ec98e543cfa3f58538e492e34858aebe04a81d8ab93b: Status 404 returned error can't find the container with id 996eb0156974055a7a10ec98e543cfa3f58538e492e34858aebe04a81d8ab93b
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.263700 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq"]
Mar 20 17:32:01 crc kubenswrapper[4795]: W0320 17:32:01.281180 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf50011ef_d180_4d84_ba10_a2da522a579d.slice/crio-f59db0b8777d8cdfc3864460fbcedf8f524e4fddadaa4889a1471603d5aa37c5 WatchSource:0}: Error finding container f59db0b8777d8cdfc3864460fbcedf8f524e4fddadaa4889a1471603d5aa37c5: Status 404 returned error can't find the container with id f59db0b8777d8cdfc3864460fbcedf8f524e4fddadaa4889a1471603d5aa37c5
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.297156 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5444f47d-jcv9w"
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.306512 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl"]
Mar 20 17:32:01 crc kubenswrapper[4795]: I0320 17:32:01.700932 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5444f47d-jcv9w"]
Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.049927 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bsp49" event={"ID":"e070281f-65f5-4c6d-b012-06c027393646","Type":"ContainerStarted","Data":"dec3f3398a33fe5be4e6458f497ee4009f7aa21becff93bc1b7bbd2cfe2b13c6"}
Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.052978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" event={"ID":"d34761db-41bf-4e5f-bdca-8c25e281c924","Type":"ContainerStarted","Data":"acedf09b035200d8295f483704044692e759e05b7a842a68c484ac0d84c79f71"}
Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.054728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" event={"ID":"65c42497-77ba-49bc-a292-5003a353fde6","Type":"ContainerStarted","Data":"996eb0156974055a7a10ec98e543cfa3f58538e492e34858aebe04a81d8ab93b"}
Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.056717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5444f47d-jcv9w" event={"ID":"b27883d5-9c21-4869-b626-3fe39f007913","Type":"ContainerStarted","Data":"2717b812a369ff2db01ee54aae4f9812df13ad6781d579c7086699913a6bb582"}
Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.056751 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5444f47d-jcv9w" event={"ID":"b27883d5-9c21-4869-b626-3fe39f007913","Type":"ContainerStarted","Data":"0e72f9902ef5f4ecebae0b68c201fabd21f682e9a69fe3297d68ace71703ce51"}
Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.060510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" event={"ID":"f50011ef-d180-4d84-ba10-a2da522a579d","Type":"ContainerStarted","Data":"f59db0b8777d8cdfc3864460fbcedf8f524e4fddadaa4889a1471603d5aa37c5"}
Mar 20 17:32:02 crc kubenswrapper[4795]: I0320 17:32:02.091577 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c5444f47d-jcv9w" podStartSLOduration=2.091547919 podStartE2EDuration="2.091547919s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:32:02.083227655 +0000 UTC m=+865.541259226" watchObservedRunningTime="2026-03-20 17:32:02.091547919 +0000 UTC m=+865.549579490"
Mar 20 17:32:03 crc kubenswrapper[4795]: I0320 17:32:03.072212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" event={"ID":"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd","Type":"ContainerStarted","Data":"616dc96d2d0585b233d6d56de6ca35d75cb094f9e09720808ac513c2c13b7e20"}
Mar 20 17:32:03 crc kubenswrapper[4795]: I0320 17:32:03.096422 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" podStartSLOduration=1.238870298 podStartE2EDuration="3.096398445s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="2026-03-20 17:32:00.725915691 +0000 UTC m=+864.183947222" lastFinishedPulling="2026-03-20 17:32:02.583443788 +0000 UTC m=+866.041475369" observedRunningTime="2026-03-20 17:32:03.088900618 +0000 UTC m=+866.546932199" watchObservedRunningTime="2026-03-20 17:32:03.096398445 +0000 UTC m=+866.554430016"
Mar 20 17:32:03 crc kubenswrapper[4795]: I0320 17:32:03.154388 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ccgqp"
Mar 20 17:32:03 crc kubenswrapper[4795]: I0320 17:32:03.198276 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ccgqp"
Mar 20 17:32:03 crc kubenswrapper[4795]: I0320 17:32:03.927089 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"]
Mar 20 17:32:04 crc kubenswrapper[4795]: I0320 17:32:04.077218 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4gbg2"
Mar 20 17:32:04 crc kubenswrapper[4795]: I0320 17:32:04.077276 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4gbg2"
Mar 20 17:32:04 crc kubenswrapper[4795]: I0320 17:32:04.079798 4795 generic.go:334] "Generic (PLEG): container finished" podID="9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" containerID="616dc96d2d0585b233d6d56de6ca35d75cb094f9e09720808ac513c2c13b7e20" exitCode=0
Mar 20 17:32:04 crc kubenswrapper[4795]: I0320 17:32:04.079832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" event={"ID":"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd","Type":"ContainerDied","Data":"616dc96d2d0585b233d6d56de6ca35d75cb094f9e09720808ac513c2c13b7e20"}
Mar 20 17:32:04 crc kubenswrapper[4795]: I0320 17:32:04.130467 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4gbg2"
Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.086602 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ccgqp" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="registry-server"
containerID="cri-o://2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42" gracePeriod=2 Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.131641 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.581113 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.665770 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcvpj\" (UniqueName: \"kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj\") pod \"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd\" (UID: \"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd\") " Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.676225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj" (OuterVolumeSpecName: "kube-api-access-wcvpj") pod "9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" (UID: "9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd"). InnerVolumeSpecName "kube-api-access-wcvpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.727720 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.766900 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities\") pod \"596dd5ef-f287-4f26-9618-c7763a911124\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.767146 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtnk4\" (UniqueName: \"kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4\") pod \"596dd5ef-f287-4f26-9618-c7763a911124\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.767260 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content\") pod \"596dd5ef-f287-4f26-9618-c7763a911124\" (UID: \"596dd5ef-f287-4f26-9618-c7763a911124\") " Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.767580 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcvpj\" (UniqueName: \"kubernetes.io/projected/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd-kube-api-access-wcvpj\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.768060 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities" (OuterVolumeSpecName: "utilities") pod "596dd5ef-f287-4f26-9618-c7763a911124" (UID: "596dd5ef-f287-4f26-9618-c7763a911124"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.770480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4" (OuterVolumeSpecName: "kube-api-access-rtnk4") pod "596dd5ef-f287-4f26-9618-c7763a911124" (UID: "596dd5ef-f287-4f26-9618-c7763a911124"). InnerVolumeSpecName "kube-api-access-rtnk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.869605 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.869642 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtnk4\" (UniqueName: \"kubernetes.io/projected/596dd5ef-f287-4f26-9618-c7763a911124-kube-api-access-rtnk4\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.902073 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "596dd5ef-f287-4f26-9618-c7763a911124" (UID: "596dd5ef-f287-4f26-9618-c7763a911124"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:05 crc kubenswrapper[4795]: I0320 17:32:05.971022 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596dd5ef-f287-4f26-9618-c7763a911124-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.093260 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" event={"ID":"65c42497-77ba-49bc-a292-5003a353fde6","Type":"ContainerStarted","Data":"ea6461fdf384c7fdde5d1bc30899be010563aaeb383147b68ad8ed612f14c3f1"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.095418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" event={"ID":"f50011ef-d180-4d84-ba10-a2da522a579d","Type":"ContainerStarted","Data":"d459d76826aed1acb3ecbfc9d943cb5cb14adf4e987d31265355842277ea08c1"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.095486 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.099792 4795 generic.go:334] "Generic (PLEG): container finished" podID="596dd5ef-f287-4f26-9618-c7763a911124" containerID="2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42" exitCode=0 Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.099820 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgqp" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.099868 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerDied","Data":"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.099900 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgqp" event={"ID":"596dd5ef-f287-4f26-9618-c7763a911124","Type":"ContainerDied","Data":"6ecffaee968618ba46837c7d9197230f37328559d4478e137afa21294923f290"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.099920 4795 scope.go:117] "RemoveContainer" containerID="2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.101321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bsp49" event={"ID":"e070281f-65f5-4c6d-b012-06c027393646","Type":"ContainerStarted","Data":"a07bc285a6c4703cf239433a163e3c24d27e03d403d782478573d52aed067438"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.101828 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.108298 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" event={"ID":"9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd","Type":"ContainerDied","Data":"8d336676c584568330dc6906a42fe736bdf03c4eab298e774dfad7ffa35d7551"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.108332 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d336676c584568330dc6906a42fe736bdf03c4eab298e774dfad7ffa35d7551" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.108387 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-b9gh7" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.112213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" event={"ID":"d34761db-41bf-4e5f-bdca-8c25e281c924","Type":"ContainerStarted","Data":"c3c19082ec0adf9bc8e7dde9f17563e5d6f70e498a384dd9ebe4a25e0784f8c9"} Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.121818 4795 scope.go:117] "RemoveContainer" containerID="fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.121372 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" podStartSLOduration=1.715865704 podStartE2EDuration="6.121350615s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="2026-03-20 17:32:01.282316575 +0000 UTC m=+864.740348116" lastFinishedPulling="2026-03-20 17:32:05.687801476 +0000 UTC m=+869.145833027" observedRunningTime="2026-03-20 17:32:06.117480723 +0000 UTC m=+869.575512274" watchObservedRunningTime="2026-03-20 17:32:06.121350615 +0000 UTC m=+869.579382156" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.157514 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-nhz8w"] Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.160009 4795 scope.go:117] "RemoveContainer" containerID="b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.171442 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-nhz8w"] Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.176508 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5jfjl" podStartSLOduration=1.961529145 podStartE2EDuration="6.176493245s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="2026-03-20 17:32:01.323246933 +0000 UTC m=+864.781278464" lastFinishedPulling="2026-03-20 17:32:05.538210983 +0000 UTC m=+868.996242564" observedRunningTime="2026-03-20 17:32:06.167572991 +0000 UTC m=+869.625604542" watchObservedRunningTime="2026-03-20 17:32:06.176493245 +0000 UTC m=+869.634524786" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.202201 4795 scope.go:117] "RemoveContainer" containerID="2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42" Mar 20 17:32:06 crc kubenswrapper[4795]: E0320 17:32:06.202789 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42\": container with ID starting with 2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42 not found: ID does not exist" containerID="2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.202819 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42"} err="failed to get container status \"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42\": rpc error: code = NotFound desc = could not find container \"2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42\": container with ID starting with 2f7c09da4d16e6d358cc9f73e0f7bfe8ae246c34b871a2f3abc3b43e723beb42 not found: ID does not exist" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.202839 4795 scope.go:117] "RemoveContainer" containerID="fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5" Mar 20 17:32:06 crc kubenswrapper[4795]: E0320 17:32:06.203635 4795 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5\": container with ID starting with fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5 not found: ID does not exist" containerID="fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.203666 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5"} err="failed to get container status \"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5\": rpc error: code = NotFound desc = could not find container \"fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5\": container with ID starting with fcd7ef1e90db8316c4c1bb194d33a7e7034542d99d3620c859e836571919a4f5 not found: ID does not exist" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.203710 4795 scope.go:117] "RemoveContainer" containerID="b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.208288 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bsp49" podStartSLOduration=1.583703541 podStartE2EDuration="6.208265651s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="2026-03-20 17:32:01.070988923 +0000 UTC m=+864.529020464" lastFinishedPulling="2026-03-20 17:32:05.695551023 +0000 UTC m=+869.153582574" observedRunningTime="2026-03-20 17:32:06.189409444 +0000 UTC m=+869.647440985" watchObservedRunningTime="2026-03-20 17:32:06.208265651 +0000 UTC m=+869.666297192" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.210728 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"] Mar 20 17:32:06 crc kubenswrapper[4795]: E0320 
17:32:06.213576 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d\": container with ID starting with b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d not found: ID does not exist" containerID="b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.213635 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d"} err="failed to get container status \"b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d\": rpc error: code = NotFound desc = could not find container \"b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d\": container with ID starting with b1e29d6e7354f7c8c2cd941aaf4fa1fa31accd680d743bf1fbfa3dfb1770e95d not found: ID does not exist" Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.214022 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ccgqp"] Mar 20 17:32:06 crc kubenswrapper[4795]: I0320 17:32:06.535056 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"] Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.133154 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4gbg2" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="registry-server" containerID="cri-o://9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5" gracePeriod=2 Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.264281 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596dd5ef-f287-4f26-9618-c7763a911124" path="/var/lib/kubelet/pods/596dd5ef-f287-4f26-9618-c7763a911124/volumes" Mar 20 17:32:07 crc 
kubenswrapper[4795]: I0320 17:32:07.265718 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740c1ddf-96e5-46f6-837c-73372748464e" path="/var/lib/kubelet/pods/740c1ddf-96e5-46f6-837c-73372748464e/volumes" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.516071 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.594934 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content\") pod \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.595004 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities\") pod \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.595030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr5sr\" (UniqueName: \"kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr\") pod \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\" (UID: \"6158d7fa-4982-4817-9c29-b1f3c3fd70d7\") " Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.596474 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities" (OuterVolumeSpecName: "utilities") pod "6158d7fa-4982-4817-9c29-b1f3c3fd70d7" (UID: "6158d7fa-4982-4817-9c29-b1f3c3fd70d7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.604586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr" (OuterVolumeSpecName: "kube-api-access-hr5sr") pod "6158d7fa-4982-4817-9c29-b1f3c3fd70d7" (UID: "6158d7fa-4982-4817-9c29-b1f3c3fd70d7"). InnerVolumeSpecName "kube-api-access-hr5sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.672233 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6158d7fa-4982-4817-9c29-b1f3c3fd70d7" (UID: "6158d7fa-4982-4817-9c29-b1f3c3fd70d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.696674 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.696870 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:07 crc kubenswrapper[4795]: I0320 17:32:07.696882 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr5sr\" (UniqueName: \"kubernetes.io/projected/6158d7fa-4982-4817-9c29-b1f3c3fd70d7-kube-api-access-hr5sr\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.142890 4795 generic.go:334] "Generic (PLEG): container finished" podID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" 
containerID="9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5" exitCode=0 Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.142966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerDied","Data":"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5"} Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.142995 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gbg2" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.143022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gbg2" event={"ID":"6158d7fa-4982-4817-9c29-b1f3c3fd70d7","Type":"ContainerDied","Data":"306762ba2f3a916ee3f297df221a3a99b8f647575a6fbd57139b3caf80350c2e"} Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.143049 4795 scope.go:117] "RemoveContainer" containerID="9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.177953 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"] Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.183189 4795 scope.go:117] "RemoveContainer" containerID="1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.186369 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4gbg2"] Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.213750 4795 scope.go:117] "RemoveContainer" containerID="341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.236291 4795 scope.go:117] "RemoveContainer" containerID="9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5" Mar 20 
17:32:08 crc kubenswrapper[4795]: E0320 17:32:08.236867 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5\": container with ID starting with 9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5 not found: ID does not exist" containerID="9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.236974 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5"} err="failed to get container status \"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5\": rpc error: code = NotFound desc = could not find container \"9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5\": container with ID starting with 9bc6e21fd4da8952b68ebd3f8b9d3b0dc9db29a59ade2c56b165a1053f3c0ff5 not found: ID does not exist" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.237028 4795 scope.go:117] "RemoveContainer" containerID="1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386" Mar 20 17:32:08 crc kubenswrapper[4795]: E0320 17:32:08.237482 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386\": container with ID starting with 1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386 not found: ID does not exist" containerID="1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.237546 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386"} err="failed to get container status 
\"1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386\": rpc error: code = NotFound desc = could not find container \"1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386\": container with ID starting with 1935c3a75c27304cf9122405db97dde8ac420d5f532c260b98121e6d84acd386 not found: ID does not exist" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.237587 4795 scope.go:117] "RemoveContainer" containerID="341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4" Mar 20 17:32:08 crc kubenswrapper[4795]: E0320 17:32:08.238096 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4\": container with ID starting with 341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4 not found: ID does not exist" containerID="341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4" Mar 20 17:32:08 crc kubenswrapper[4795]: I0320 17:32:08.238134 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4"} err="failed to get container status \"341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4\": rpc error: code = NotFound desc = could not find container \"341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4\": container with ID starting with 341555f5f0c0bae25672e8ec9663f80e9d3defa977ded17387b3d01fd8c553b4 not found: ID does not exist" Mar 20 17:32:09 crc kubenswrapper[4795]: I0320 17:32:09.154058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" event={"ID":"65c42497-77ba-49bc-a292-5003a353fde6","Type":"ContainerStarted","Data":"f83c019099cbb0a94a45c2f70d6c8fc2b9bda454b59ae1fe43d96c2782fd7021"} Mar 20 17:32:09 crc kubenswrapper[4795]: I0320 17:32:09.270541 4795 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" path="/var/lib/kubelet/pods/6158d7fa-4982-4817-9c29-b1f3c3fd70d7/volumes" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.086784 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bsp49" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.115544 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-xjj2s" podStartSLOduration=4.172647777 podStartE2EDuration="11.115519436s" podCreationTimestamp="2026-03-20 17:32:00 +0000 UTC" firstStartedPulling="2026-03-20 17:32:01.244449974 +0000 UTC m=+864.702481505" lastFinishedPulling="2026-03-20 17:32:08.187321623 +0000 UTC m=+871.645353164" observedRunningTime="2026-03-20 17:32:09.193535204 +0000 UTC m=+872.651566795" watchObservedRunningTime="2026-03-20 17:32:11.115519436 +0000 UTC m=+874.573551007" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.298294 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.298371 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.301469 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.301535 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.301587 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.302249 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.302353 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842" gracePeriod=600 Mar 20 17:32:11 crc kubenswrapper[4795]: I0320 17:32:11.305490 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.191457 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842" exitCode=0 Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.191560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842"} Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.191902 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc"} Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.191938 4795 scope.go:117] "RemoveContainer" containerID="ed9bda6f42a6d288b35bf70696e5594aa2925d61c10f456b31f21f44bcd49f2f" Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.199209 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c5444f47d-jcv9w" Mar 20 17:32:12 crc kubenswrapper[4795]: I0320 17:32:12.292711 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hn4r8"] Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.940446 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941434 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" containerName="oc" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941453 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" containerName="oc" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941469 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="extract-utilities" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941477 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="extract-utilities" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941493 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941500 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941518 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941525 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941539 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="extract-content" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941547 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="extract-content" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941557 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="extract-content" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941564 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="extract-content" Mar 20 17:32:15 crc kubenswrapper[4795]: E0320 17:32:15.941582 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="extract-utilities" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941592 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="extract-utilities" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941775 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="596dd5ef-f287-4f26-9618-c7763a911124" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941796 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" containerName="oc" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.941808 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6158d7fa-4982-4817-9c29-b1f3c3fd70d7" containerName="registry-server" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.942727 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:15 crc kubenswrapper[4795]: I0320 17:32:15.958991 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.025318 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.025442 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6crb\" (UniqueName: \"kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.025524 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 
17:32:16.126878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.126935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6crb\" (UniqueName: \"kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.126978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.127547 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.127776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.161890 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6crb\" (UniqueName: \"kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb\") pod \"community-operators-sqkkm\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.263000 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:16 crc kubenswrapper[4795]: I0320 17:32:16.494906 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:16 crc kubenswrapper[4795]: W0320 17:32:16.501290 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7dec918_b9ad_46a4_b161_6006552b910e.slice/crio-ac625ba6991f66b05a7da35ba3253419b0714209879d9705988d2d55f59c5e08 WatchSource:0}: Error finding container ac625ba6991f66b05a7da35ba3253419b0714209879d9705988d2d55f59c5e08: Status 404 returned error can't find the container with id ac625ba6991f66b05a7da35ba3253419b0714209879d9705988d2d55f59c5e08 Mar 20 17:32:17 crc kubenswrapper[4795]: I0320 17:32:17.237064 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7dec918-b9ad-46a4-b161-6006552b910e" containerID="07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d" exitCode=0 Mar 20 17:32:17 crc kubenswrapper[4795]: I0320 17:32:17.237132 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerDied","Data":"07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d"} Mar 20 17:32:17 crc kubenswrapper[4795]: I0320 17:32:17.237173 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" 
event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerStarted","Data":"ac625ba6991f66b05a7da35ba3253419b0714209879d9705988d2d55f59c5e08"} Mar 20 17:32:18 crc kubenswrapper[4795]: I0320 17:32:18.249290 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerStarted","Data":"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733"} Mar 20 17:32:19 crc kubenswrapper[4795]: I0320 17:32:19.260133 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7dec918-b9ad-46a4-b161-6006552b910e" containerID="8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733" exitCode=0 Mar 20 17:32:19 crc kubenswrapper[4795]: I0320 17:32:19.271538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerDied","Data":"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733"} Mar 20 17:32:21 crc kubenswrapper[4795]: I0320 17:32:21.023755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mjhsq" Mar 20 17:32:21 crc kubenswrapper[4795]: I0320 17:32:21.278210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerStarted","Data":"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e"} Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 17:32:26.263977 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 17:32:26.264405 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 
17:32:26.334580 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 17:32:26.362236 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sqkkm" podStartSLOduration=8.299844587 podStartE2EDuration="11.362218284s" podCreationTimestamp="2026-03-20 17:32:15 +0000 UTC" firstStartedPulling="2026-03-20 17:32:17.240622061 +0000 UTC m=+880.698653632" lastFinishedPulling="2026-03-20 17:32:20.302995788 +0000 UTC m=+883.761027329" observedRunningTime="2026-03-20 17:32:21.301958988 +0000 UTC m=+884.759990569" watchObservedRunningTime="2026-03-20 17:32:26.362218284 +0000 UTC m=+889.820249835" Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 17:32:26.386985 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:26 crc kubenswrapper[4795]: I0320 17:32:26.576712 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:28 crc kubenswrapper[4795]: I0320 17:32:28.335474 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sqkkm" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="registry-server" containerID="cri-o://054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e" gracePeriod=2 Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.074968 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.166644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities\") pod \"c7dec918-b9ad-46a4-b161-6006552b910e\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.166781 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content\") pod \"c7dec918-b9ad-46a4-b161-6006552b910e\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.168363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities" (OuterVolumeSpecName: "utilities") pod "c7dec918-b9ad-46a4-b161-6006552b910e" (UID: "c7dec918-b9ad-46a4-b161-6006552b910e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.172958 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6crb\" (UniqueName: \"kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb\") pod \"c7dec918-b9ad-46a4-b161-6006552b910e\" (UID: \"c7dec918-b9ad-46a4-b161-6006552b910e\") " Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.173566 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.178927 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb" (OuterVolumeSpecName: "kube-api-access-t6crb") pod "c7dec918-b9ad-46a4-b161-6006552b910e" (UID: "c7dec918-b9ad-46a4-b161-6006552b910e"). InnerVolumeSpecName "kube-api-access-t6crb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.227169 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7dec918-b9ad-46a4-b161-6006552b910e" (UID: "c7dec918-b9ad-46a4-b161-6006552b910e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.274597 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dec918-b9ad-46a4-b161-6006552b910e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.274648 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6crb\" (UniqueName: \"kubernetes.io/projected/c7dec918-b9ad-46a4-b161-6006552b910e-kube-api-access-t6crb\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.353039 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7dec918-b9ad-46a4-b161-6006552b910e" containerID="054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e" exitCode=0 Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.353123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerDied","Data":"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e"} Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.353141 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqkkm" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.353193 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqkkm" event={"ID":"c7dec918-b9ad-46a4-b161-6006552b910e","Type":"ContainerDied","Data":"ac625ba6991f66b05a7da35ba3253419b0714209879d9705988d2d55f59c5e08"} Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.353234 4795 scope.go:117] "RemoveContainer" containerID="054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.389168 4795 scope.go:117] "RemoveContainer" containerID="8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.414448 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.419405 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sqkkm"] Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.423793 4795 scope.go:117] "RemoveContainer" containerID="07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.450401 4795 scope.go:117] "RemoveContainer" containerID="054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e" Mar 20 17:32:30 crc kubenswrapper[4795]: E0320 17:32:30.450807 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e\": container with ID starting with 054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e not found: ID does not exist" containerID="054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.450857 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e"} err="failed to get container status \"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e\": rpc error: code = NotFound desc = could not find container \"054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e\": container with ID starting with 054097e58b634b190b31653bd4364391b397334567a0596926b99ceeb750191e not found: ID does not exist" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.450925 4795 scope.go:117] "RemoveContainer" containerID="8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733" Mar 20 17:32:30 crc kubenswrapper[4795]: E0320 17:32:30.452066 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733\": container with ID starting with 8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733 not found: ID does not exist" containerID="8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.452169 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733"} err="failed to get container status \"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733\": rpc error: code = NotFound desc = could not find container \"8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733\": container with ID starting with 8c23186907d0617e05f69640210644311057f189040853ec43f7955efee8c733 not found: ID does not exist" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.452194 4795 scope.go:117] "RemoveContainer" containerID="07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d" Mar 20 17:32:30 crc kubenswrapper[4795]: E0320 
17:32:30.452601 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d\": container with ID starting with 07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d not found: ID does not exist" containerID="07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d" Mar 20 17:32:30 crc kubenswrapper[4795]: I0320 17:32:30.452626 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d"} err="failed to get container status \"07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d\": rpc error: code = NotFound desc = could not find container \"07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d\": container with ID starting with 07fb10116862f8c6cf3b4057db7a697551756490e25106be014d03db1dc1314d not found: ID does not exist" Mar 20 17:32:31 crc kubenswrapper[4795]: I0320 17:32:31.279854 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" path="/var/lib/kubelet/pods/c7dec918-b9ad-46a4-b161-6006552b910e/volumes" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.764556 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk"] Mar 20 17:32:35 crc kubenswrapper[4795]: E0320 17:32:35.765587 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="registry-server" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.765605 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="registry-server" Mar 20 17:32:35 crc kubenswrapper[4795]: E0320 17:32:35.765644 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="extract-utilities" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.765653 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="extract-utilities" Mar 20 17:32:35 crc kubenswrapper[4795]: E0320 17:32:35.765671 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="extract-content" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.765681 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="extract-content" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.765974 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7dec918-b9ad-46a4-b161-6006552b910e" containerName="registry-server" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.766996 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.769472 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.780372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk"] Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.853882 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc 
kubenswrapper[4795]: I0320 17:32:35.854305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m4st\" (UniqueName: \"kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.854480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.956112 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.956219 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m4st\" (UniqueName: \"kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.956261 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.957089 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.957489 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:35 crc kubenswrapper[4795]: I0320 17:32:35.991510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m4st\" (UniqueName: \"kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:36 crc kubenswrapper[4795]: I0320 17:32:36.094840 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:36 crc kubenswrapper[4795]: I0320 17:32:36.347031 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk"] Mar 20 17:32:36 crc kubenswrapper[4795]: I0320 17:32:36.399304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" event={"ID":"6d525cd0-41ce-4352-8ce0-8f24113c89d0","Type":"ContainerStarted","Data":"7637a791c41bad7c601738f0094c7d42302c4b652cd02e91a4b24a6955f0768a"} Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.352769 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hn4r8" podUID="662f8843-e25d-48ce-989d-9ea05937757d" containerName="console" containerID="cri-o://1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0" gracePeriod=15 Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.410401 4795 generic.go:334] "Generic (PLEG): container finished" podID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerID="88c4d3a2ad80e2fc476afe69ca64363007aa94b8e2632cdb73754ffffcfc98b5" exitCode=0 Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.410470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" event={"ID":"6d525cd0-41ce-4352-8ce0-8f24113c89d0","Type":"ContainerDied","Data":"88c4d3a2ad80e2fc476afe69ca64363007aa94b8e2632cdb73754ffffcfc98b5"} Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.769014 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hn4r8_662f8843-e25d-48ce-989d-9ea05937757d/console/0.log" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.769124 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.880603 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.880756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.880818 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.881887 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.881982 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config" (OuterVolumeSpecName: "console-config") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.882002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.882111 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.882228 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dskm\" (UniqueName: \"kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm\") pod \"662f8843-e25d-48ce-989d-9ea05937757d\" (UID: \"662f8843-e25d-48ce-989d-9ea05937757d\") " Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.882623 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.883218 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.883268 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.883926 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca" (OuterVolumeSpecName: "service-ca") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.884056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.887206 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.887816 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm" (OuterVolumeSpecName: "kube-api-access-5dskm") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "kube-api-access-5dskm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.891762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "662f8843-e25d-48ce-989d-9ea05937757d" (UID: "662f8843-e25d-48ce-989d-9ea05937757d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.984398 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dskm\" (UniqueName: \"kubernetes.io/projected/662f8843-e25d-48ce-989d-9ea05937757d-kube-api-access-5dskm\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.984453 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.984462 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.984470 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/662f8843-e25d-48ce-989d-9ea05937757d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:37 crc kubenswrapper[4795]: I0320 17:32:37.984479 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/662f8843-e25d-48ce-989d-9ea05937757d-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.421882 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hn4r8_662f8843-e25d-48ce-989d-9ea05937757d/console/0.log" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.422389 4795 generic.go:334] "Generic (PLEG): container finished" podID="662f8843-e25d-48ce-989d-9ea05937757d" containerID="1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0" exitCode=2 Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.422439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hn4r8" event={"ID":"662f8843-e25d-48ce-989d-9ea05937757d","Type":"ContainerDied","Data":"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0"} Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.422485 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hn4r8" event={"ID":"662f8843-e25d-48ce-989d-9ea05937757d","Type":"ContainerDied","Data":"c7f65d1274bb19079f9f79351a782d7495541a1ecdc8d88a866af54812721807"} Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.422521 4795 scope.go:117] "RemoveContainer" containerID="1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.422784 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hn4r8" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.456493 4795 scope.go:117] "RemoveContainer" containerID="1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0" Mar 20 17:32:38 crc kubenswrapper[4795]: E0320 17:32:38.457185 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0\": container with ID starting with 1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0 not found: ID does not exist" containerID="1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.457253 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0"} err="failed to get container status \"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0\": rpc error: code = NotFound desc = could not find container \"1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0\": container with ID starting with 1a7702f7f750941cb60d392fa8c8e3bd4d4232cc05ee10b9eef66c38eff099c0 not found: ID does not exist" Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.488001 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hn4r8"] Mar 20 17:32:38 crc kubenswrapper[4795]: I0320 17:32:38.496230 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hn4r8"] Mar 20 17:32:39 crc kubenswrapper[4795]: I0320 17:32:39.265220 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="662f8843-e25d-48ce-989d-9ea05937757d" path="/var/lib/kubelet/pods/662f8843-e25d-48ce-989d-9ea05937757d/volumes" Mar 20 17:32:40 crc kubenswrapper[4795]: I0320 17:32:40.442501 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerID="e8e016cf54131355bcae914bb33679be4df5b923faa8aa326ed84ef75f46b216" exitCode=0 Mar 20 17:32:40 crc kubenswrapper[4795]: I0320 17:32:40.442590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" event={"ID":"6d525cd0-41ce-4352-8ce0-8f24113c89d0","Type":"ContainerDied","Data":"e8e016cf54131355bcae914bb33679be4df5b923faa8aa326ed84ef75f46b216"} Mar 20 17:32:41 crc kubenswrapper[4795]: I0320 17:32:41.454679 4795 generic.go:334] "Generic (PLEG): container finished" podID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerID="a4557c0a88edbaa0a8b87f75e4839dd3dcfda99e54e04b9994d601e11f01d82b" exitCode=0 Mar 20 17:32:41 crc kubenswrapper[4795]: I0320 17:32:41.455048 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" event={"ID":"6d525cd0-41ce-4352-8ce0-8f24113c89d0","Type":"ContainerDied","Data":"a4557c0a88edbaa0a8b87f75e4839dd3dcfda99e54e04b9994d601e11f01d82b"} Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.850938 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.957601 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle\") pod \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.957676 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util\") pod \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.957763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m4st\" (UniqueName: \"kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st\") pod \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\" (UID: \"6d525cd0-41ce-4352-8ce0-8f24113c89d0\") " Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.959113 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle" (OuterVolumeSpecName: "bundle") pod "6d525cd0-41ce-4352-8ce0-8f24113c89d0" (UID: "6d525cd0-41ce-4352-8ce0-8f24113c89d0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.964518 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st" (OuterVolumeSpecName: "kube-api-access-4m4st") pod "6d525cd0-41ce-4352-8ce0-8f24113c89d0" (UID: "6d525cd0-41ce-4352-8ce0-8f24113c89d0"). InnerVolumeSpecName "kube-api-access-4m4st". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:42 crc kubenswrapper[4795]: I0320 17:32:42.972525 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util" (OuterVolumeSpecName: "util") pod "6d525cd0-41ce-4352-8ce0-8f24113c89d0" (UID: "6d525cd0-41ce-4352-8ce0-8f24113c89d0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.060009 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.060069 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d525cd0-41ce-4352-8ce0-8f24113c89d0-util\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.060135 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m4st\" (UniqueName: \"kubernetes.io/projected/6d525cd0-41ce-4352-8ce0-8f24113c89d0-kube-api-access-4m4st\") on node \"crc\" DevicePath \"\"" Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.471424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" event={"ID":"6d525cd0-41ce-4352-8ce0-8f24113c89d0","Type":"ContainerDied","Data":"7637a791c41bad7c601738f0094c7d42302c4b652cd02e91a4b24a6955f0768a"} Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.471497 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7637a791c41bad7c601738f0094c7d42302c4b652cd02e91a4b24a6955f0768a" Mar 20 17:32:43 crc kubenswrapper[4795]: I0320 17:32:43.471566 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk" Mar 20 17:32:49 crc kubenswrapper[4795]: I0320 17:32:49.374760 4795 scope.go:117] "RemoveContainer" containerID="ea095688dd8877661afbf85ce172a04981e2524e4cbc5e45ea0fa637fadfbc39" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.886475 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj"] Mar 20 17:32:53 crc kubenswrapper[4795]: E0320 17:32:53.887308 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="662f8843-e25d-48ce-989d-9ea05937757d" containerName="console" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887325 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="662f8843-e25d-48ce-989d-9ea05937757d" containerName="console" Mar 20 17:32:53 crc kubenswrapper[4795]: E0320 17:32:53.887345 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="extract" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887353 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="extract" Mar 20 17:32:53 crc kubenswrapper[4795]: E0320 17:32:53.887366 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="util" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887374 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="util" Mar 20 17:32:53 crc kubenswrapper[4795]: E0320 17:32:53.887384 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="pull" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887392 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="pull" 
Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887513 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d525cd0-41ce-4352-8ce0-8f24113c89d0" containerName="extract" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.887539 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="662f8843-e25d-48ce-989d-9ea05937757d" containerName="console" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.888079 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.893203 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.893353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.894152 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.894203 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.894424 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-2zg9w" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.904179 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj"] Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.915359 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-apiservice-cert\") pod 
\"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.915424 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-webhook-cert\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:53 crc kubenswrapper[4795]: I0320 17:32:53.915502 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7mf5\" (UniqueName: \"kubernetes.io/projected/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-kube-api-access-s7mf5\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.016713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mf5\" (UniqueName: \"kubernetes.io/projected/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-kube-api-access-s7mf5\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.016770 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-apiservice-cert\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " 
pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.016806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-webhook-cert\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.026435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-apiservice-cert\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.026442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-webhook-cert\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.036153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7mf5\" (UniqueName: \"kubernetes.io/projected/0e8dba8d-8387-4ced-ac54-b8d5e1cf3650-kube-api-access-s7mf5\") pod \"metallb-operator-controller-manager-7547f4d8c8-499mj\" (UID: \"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650\") " pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.133449 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn"] Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.134238 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.137245 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.137500 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.137936 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-2bb4s" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.151253 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn"] Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.204513 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.387790 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvn5f\" (UniqueName: \"kubernetes.io/projected/2d29ac93-da31-4834-a858-d5bd9adb28d1-kube-api-access-pvn5f\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.387828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-apiservice-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.387872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-webhook-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.492229 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-webhook-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.492627 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pvn5f\" (UniqueName: \"kubernetes.io/projected/2d29ac93-da31-4834-a858-d5bd9adb28d1-kube-api-access-pvn5f\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.492668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-apiservice-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.497363 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-apiservice-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.503267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d29ac93-da31-4834-a858-d5bd9adb28d1-webhook-cert\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.516016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvn5f\" (UniqueName: \"kubernetes.io/projected/2d29ac93-da31-4834-a858-d5bd9adb28d1-kube-api-access-pvn5f\") pod \"metallb-operator-webhook-server-5ffc48dc7-t9vwn\" (UID: \"2d29ac93-da31-4834-a858-d5bd9adb28d1\") " 
pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.682946 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj"] Mar 20 17:32:54 crc kubenswrapper[4795]: W0320 17:32:54.691303 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e8dba8d_8387_4ced_ac54_b8d5e1cf3650.slice/crio-0215583718df8f90686f9f9d510b84b56d69b9e2094799ec63df11fbc2fc0631 WatchSource:0}: Error finding container 0215583718df8f90686f9f9d510b84b56d69b9e2094799ec63df11fbc2fc0631: Status 404 returned error can't find the container with id 0215583718df8f90686f9f9d510b84b56d69b9e2094799ec63df11fbc2fc0631 Mar 20 17:32:54 crc kubenswrapper[4795]: I0320 17:32:54.749570 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:55 crc kubenswrapper[4795]: I0320 17:32:55.180104 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn"] Mar 20 17:32:55 crc kubenswrapper[4795]: W0320 17:32:55.184245 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d29ac93_da31_4834_a858_d5bd9adb28d1.slice/crio-f96f11e17a4f39721398b1877a37bdda5943061dd228d066737a9fd2f9b31cce WatchSource:0}: Error finding container f96f11e17a4f39721398b1877a37bdda5943061dd228d066737a9fd2f9b31cce: Status 404 returned error can't find the container with id f96f11e17a4f39721398b1877a37bdda5943061dd228d066737a9fd2f9b31cce Mar 20 17:32:55 crc kubenswrapper[4795]: I0320 17:32:55.569587 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" 
event={"ID":"2d29ac93-da31-4834-a858-d5bd9adb28d1","Type":"ContainerStarted","Data":"f96f11e17a4f39721398b1877a37bdda5943061dd228d066737a9fd2f9b31cce"} Mar 20 17:32:55 crc kubenswrapper[4795]: I0320 17:32:55.571105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" event={"ID":"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650","Type":"ContainerStarted","Data":"0215583718df8f90686f9f9d510b84b56d69b9e2094799ec63df11fbc2fc0631"} Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.601738 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" event={"ID":"0e8dba8d-8387-4ced-ac54-b8d5e1cf3650","Type":"ContainerStarted","Data":"f91d74b3e73995b5f9612f408ecd9f5188f94046424f89e59f41879694428f17"} Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.602615 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.610078 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" event={"ID":"2d29ac93-da31-4834-a858-d5bd9adb28d1","Type":"ContainerStarted","Data":"7db33e2ff1b9b090bbdefd1d2955608a95dcea00951a1a56f2ad252b0294020a"} Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.610896 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.678289 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" podStartSLOduration=2.234754039 podStartE2EDuration="6.678268596s" podCreationTimestamp="2026-03-20 17:32:53 +0000 UTC" firstStartedPulling="2026-03-20 17:32:54.695714285 +0000 UTC m=+918.153745826" 
lastFinishedPulling="2026-03-20 17:32:59.139228842 +0000 UTC m=+922.597260383" observedRunningTime="2026-03-20 17:32:59.640646083 +0000 UTC m=+923.098677624" watchObservedRunningTime="2026-03-20 17:32:59.678268596 +0000 UTC m=+923.136300137" Mar 20 17:32:59 crc kubenswrapper[4795]: I0320 17:32:59.679737 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" podStartSLOduration=1.638526994 podStartE2EDuration="5.679728493s" podCreationTimestamp="2026-03-20 17:32:54 +0000 UTC" firstStartedPulling="2026-03-20 17:32:55.190715762 +0000 UTC m=+918.648747303" lastFinishedPulling="2026-03-20 17:32:59.231917261 +0000 UTC m=+922.689948802" observedRunningTime="2026-03-20 17:32:59.667331109 +0000 UTC m=+923.125362650" watchObservedRunningTime="2026-03-20 17:32:59.679728493 +0000 UTC m=+923.137760044" Mar 20 17:33:14 crc kubenswrapper[4795]: I0320 17:33:14.754304 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5ffc48dc7-t9vwn" Mar 20 17:33:34 crc kubenswrapper[4795]: I0320 17:33:34.207073 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7547f4d8c8-499mj" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.001556 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-66lbd"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.020467 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.026986 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.027604 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.028298 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.028848 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kv4mk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.029178 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.030439 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.034079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.093315 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-bl9qp"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.094212 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.096811 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.096967 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zwrjm" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.097115 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.097243 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.100913 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-kvtc5"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.102136 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.104070 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.119723 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-kvtc5"] Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-reloader\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-conf\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156304 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-sockets\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156372 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw4vm\" (UniqueName: \"kubernetes.io/projected/a748ee28-0a26-4700-b384-3afa65b8ac9d-kube-api-access-cw4vm\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156404 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics-certs\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156433 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfrdt\" (UniqueName: \"kubernetes.io/projected/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-kube-api-access-cfrdt\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.156473 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-startup\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257134 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-metrics-certs\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257180 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4qsm\" (UniqueName: \"kubernetes.io/projected/2ce06e1f-5454-4b85-888b-3230c0086c2e-kube-api-access-z4qsm\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257211 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-conf\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257284 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-sockets\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257417 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw4vm\" (UniqueName: 
\"kubernetes.io/projected/a748ee28-0a26-4700-b384-3afa65b8ac9d-kube-api-access-cw4vm\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8834c8fc-36f7-41da-867f-ec5a32e25b36-metallb-excludel2\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257531 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics-certs\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257579 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbg6h\" (UniqueName: \"kubernetes.io/projected/8834c8fc-36f7-41da-867f-ec5a32e25b36-kube-api-access-wbg6h\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257625 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257636 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics\") pod \"frr-k8s-66lbd\" (UID: 
\"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257649 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfrdt\" (UniqueName: \"kubernetes.io/projected/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-kube-api-access-cfrdt\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257671 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-startup\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257722 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-cert\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257728 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-sockets\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257761 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-metrics-certs\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " 
pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257783 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257839 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-reloader\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.257949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-conf\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.258204 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a748ee28-0a26-4700-b384-3afa65b8ac9d-reloader\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.258707 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a748ee28-0a26-4700-b384-3afa65b8ac9d-frr-startup\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.263545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.274402 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a748ee28-0a26-4700-b384-3afa65b8ac9d-metrics-certs\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.274450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfrdt\" (UniqueName: \"kubernetes.io/projected/377dbbb7-0571-40cd-9fe3-3c86fbf4f092-kube-api-access-cfrdt\") pod \"frr-k8s-webhook-server-bcc4b6f68-jw5dk\" (UID: \"377dbbb7-0571-40cd-9fe3-3c86fbf4f092\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.276841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw4vm\" (UniqueName: \"kubernetes.io/projected/a748ee28-0a26-4700-b384-3afa65b8ac9d-kube-api-access-cw4vm\") pod \"frr-k8s-66lbd\" (UID: \"a748ee28-0a26-4700-b384-3afa65b8ac9d\") " pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.349081 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.355528 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8834c8fc-36f7-41da-867f-ec5a32e25b36-metallb-excludel2\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359176 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbg6h\" (UniqueName: \"kubernetes.io/projected/8834c8fc-36f7-41da-867f-ec5a32e25b36-kube-api-access-wbg6h\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359205 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-cert\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359229 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-metrics-certs\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359272 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 
17:33:35.359303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-metrics-certs\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.359328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4qsm\" (UniqueName: \"kubernetes.io/projected/2ce06e1f-5454-4b85-888b-3230c0086c2e-kube-api-access-z4qsm\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: E0320 17:33:35.359714 4795 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 17:33:35 crc kubenswrapper[4795]: E0320 17:33:35.359825 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist podName:8834c8fc-36f7-41da-867f-ec5a32e25b36 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:35.859805885 +0000 UTC m=+959.317837416 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist") pod "speaker-bl9qp" (UID: "8834c8fc-36f7-41da-867f-ec5a32e25b36") : secret "metallb-memberlist" not found Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.360002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8834c8fc-36f7-41da-867f-ec5a32e25b36-metallb-excludel2\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.366124 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-cert\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.367153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-metrics-certs\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.367231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ce06e1f-5454-4b85-888b-3230c0086c2e-metrics-certs\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.374727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbg6h\" (UniqueName: \"kubernetes.io/projected/8834c8fc-36f7-41da-867f-ec5a32e25b36-kube-api-access-wbg6h\") pod \"speaker-bl9qp\" (UID: 
\"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.375136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4qsm\" (UniqueName: \"kubernetes.io/projected/2ce06e1f-5454-4b85-888b-3230c0086c2e-kube-api-access-z4qsm\") pod \"controller-7bb4cc7c98-kvtc5\" (UID: \"2ce06e1f-5454-4b85-888b-3230c0086c2e\") " pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.421879 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.504166 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.594557 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk"] Mar 20 17:33:35 crc kubenswrapper[4795]: W0320 17:33:35.605924 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod377dbbb7_0571_40cd_9fe3_3c86fbf4f092.slice/crio-45a7b944edac1e024860a643b86626876d99698be893f8e2a415be1b99491d36 WatchSource:0}: Error finding container 45a7b944edac1e024860a643b86626876d99698be893f8e2a415be1b99491d36: Status 404 returned error can't find the container with id 45a7b944edac1e024860a643b86626876d99698be893f8e2a415be1b99491d36 Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.669179 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-kvtc5"] Mar 20 17:33:35 crc kubenswrapper[4795]: W0320 17:33:35.670915 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ce06e1f_5454_4b85_888b_3230c0086c2e.slice/crio-432bc5993ebf899ffa3de0cfccca983585acaf9969b445abb90d2ac90a38840a WatchSource:0}: Error finding container 432bc5993ebf899ffa3de0cfccca983585acaf9969b445abb90d2ac90a38840a: Status 404 returned error can't find the container with id 432bc5993ebf899ffa3de0cfccca983585acaf9969b445abb90d2ac90a38840a Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.853394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"3c6895663363858ae342f3e26060bd577e242c085ed55ce1e69814251bd4289e"} Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.854840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" event={"ID":"377dbbb7-0571-40cd-9fe3-3c86fbf4f092","Type":"ContainerStarted","Data":"45a7b944edac1e024860a643b86626876d99698be893f8e2a415be1b99491d36"} Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.857661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kvtc5" event={"ID":"2ce06e1f-5454-4b85-888b-3230c0086c2e","Type":"ContainerStarted","Data":"87543faa1add05e5db5b3c208fd9c39da9bf337e18b6b46e7bb90710e114bb28"} Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.857730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kvtc5" event={"ID":"2ce06e1f-5454-4b85-888b-3230c0086c2e","Type":"ContainerStarted","Data":"432bc5993ebf899ffa3de0cfccca983585acaf9969b445abb90d2ac90a38840a"} Mar 20 17:33:35 crc kubenswrapper[4795]: I0320 17:33:35.867362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") 
" pod="metallb-system/speaker-bl9qp" Mar 20 17:33:35 crc kubenswrapper[4795]: E0320 17:33:35.867519 4795 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 17:33:35 crc kubenswrapper[4795]: E0320 17:33:35.867620 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist podName:8834c8fc-36f7-41da-867f-ec5a32e25b36 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:36.867593768 +0000 UTC m=+960.325625329 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist") pod "speaker-bl9qp" (UID: "8834c8fc-36f7-41da-867f-ec5a32e25b36") : secret "metallb-memberlist" not found Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.868653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kvtc5" event={"ID":"2ce06e1f-5454-4b85-888b-3230c0086c2e","Type":"ContainerStarted","Data":"73c816d4902fa5f36ea0b719f42fb9e5c6f702186fd0eaecdac1bcb98f0a70a7"} Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.869174 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.882293 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.889439 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-kvtc5" podStartSLOduration=1.889418503 podStartE2EDuration="1.889418503s" podCreationTimestamp="2026-03-20 17:33:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:36.889150125 +0000 UTC m=+960.347181666" watchObservedRunningTime="2026-03-20 17:33:36.889418503 +0000 UTC m=+960.347450054" Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.893368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8834c8fc-36f7-41da-867f-ec5a32e25b36-memberlist\") pod \"speaker-bl9qp\" (UID: \"8834c8fc-36f7-41da-867f-ec5a32e25b36\") " pod="metallb-system/speaker-bl9qp" Mar 20 17:33:36 crc kubenswrapper[4795]: I0320 17:33:36.912874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bl9qp" Mar 20 17:33:36 crc kubenswrapper[4795]: W0320 17:33:36.934378 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8834c8fc_36f7_41da_867f_ec5a32e25b36.slice/crio-340745e9296bee4add971b6c93233ef676603b87935ef4d0d0bf5c778589b409 WatchSource:0}: Error finding container 340745e9296bee4add971b6c93233ef676603b87935ef4d0d0bf5c778589b409: Status 404 returned error can't find the container with id 340745e9296bee4add971b6c93233ef676603b87935ef4d0d0bf5c778589b409 Mar 20 17:33:37 crc kubenswrapper[4795]: I0320 17:33:37.876880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bl9qp" event={"ID":"8834c8fc-36f7-41da-867f-ec5a32e25b36","Type":"ContainerStarted","Data":"ecf59a1d81c53ef7c38d2ffe50195692e67f83977cd308c3bb82a0e9eda4b3be"} Mar 20 17:33:37 crc kubenswrapper[4795]: I0320 17:33:37.877392 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bl9qp" event={"ID":"8834c8fc-36f7-41da-867f-ec5a32e25b36","Type":"ContainerStarted","Data":"36d8927e87fcdd7c48f3fb7381aac211e7f656aebf76d5e4d14eeeb505b70d3d"} Mar 20 17:33:37 crc kubenswrapper[4795]: I0320 17:33:37.877404 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bl9qp" event={"ID":"8834c8fc-36f7-41da-867f-ec5a32e25b36","Type":"ContainerStarted","Data":"340745e9296bee4add971b6c93233ef676603b87935ef4d0d0bf5c778589b409"} Mar 20 17:33:37 crc kubenswrapper[4795]: I0320 17:33:37.877585 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bl9qp" Mar 20 17:33:37 crc kubenswrapper[4795]: I0320 17:33:37.900120 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-bl9qp" podStartSLOduration=2.900100225 podStartE2EDuration="2.900100225s" podCreationTimestamp="2026-03-20 17:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:37.895297203 +0000 UTC m=+961.353328744" watchObservedRunningTime="2026-03-20 17:33:37.900100225 +0000 UTC m=+961.358131766" Mar 20 17:33:42 crc kubenswrapper[4795]: I0320 17:33:42.914405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" event={"ID":"377dbbb7-0571-40cd-9fe3-3c86fbf4f092","Type":"ContainerStarted","Data":"ca0d998564af719a65ad2a27fe4207130063d44a3182cb718440c781f43e2879"} Mar 20 17:33:42 crc kubenswrapper[4795]: I0320 17:33:42.914844 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:42 crc kubenswrapper[4795]: I0320 17:33:42.918442 4795 generic.go:334] "Generic (PLEG): container finished" podID="a748ee28-0a26-4700-b384-3afa65b8ac9d" containerID="9ee46ca367ff1a085d923d8850b01ab5d9eccab652d16ced57a2f7ea507d3ae5" exitCode=0 Mar 20 17:33:42 crc kubenswrapper[4795]: I0320 17:33:42.918501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" 
event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerDied","Data":"9ee46ca367ff1a085d923d8850b01ab5d9eccab652d16ced57a2f7ea507d3ae5"} Mar 20 17:33:42 crc kubenswrapper[4795]: I0320 17:33:42.950167 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" podStartSLOduration=2.268513712 podStartE2EDuration="8.950141097s" podCreationTimestamp="2026-03-20 17:33:34 +0000 UTC" firstStartedPulling="2026-03-20 17:33:35.607441718 +0000 UTC m=+959.065473259" lastFinishedPulling="2026-03-20 17:33:42.289069103 +0000 UTC m=+965.747100644" observedRunningTime="2026-03-20 17:33:42.943536858 +0000 UTC m=+966.401568469" watchObservedRunningTime="2026-03-20 17:33:42.950141097 +0000 UTC m=+966.408172668" Mar 20 17:33:43 crc kubenswrapper[4795]: I0320 17:33:43.930638 4795 generic.go:334] "Generic (PLEG): container finished" podID="a748ee28-0a26-4700-b384-3afa65b8ac9d" containerID="98e2d9a872128eabc898525d34dbc099336810588701e0b2261f093cf837ef14" exitCode=0 Mar 20 17:33:43 crc kubenswrapper[4795]: I0320 17:33:43.930924 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerDied","Data":"98e2d9a872128eabc898525d34dbc099336810588701e0b2261f093cf837ef14"} Mar 20 17:33:44 crc kubenswrapper[4795]: I0320 17:33:44.941592 4795 generic.go:334] "Generic (PLEG): container finished" podID="a748ee28-0a26-4700-b384-3afa65b8ac9d" containerID="07639cb26bcdf1ced6b6b72d8fbf3fb6114c2e3bfac1ea404f0d143cb7bf1eef" exitCode=0 Mar 20 17:33:44 crc kubenswrapper[4795]: I0320 17:33:44.941633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerDied","Data":"07639cb26bcdf1ced6b6b72d8fbf3fb6114c2e3bfac1ea404f0d143cb7bf1eef"} Mar 20 17:33:45 crc kubenswrapper[4795]: I0320 17:33:45.429724 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-kvtc5" Mar 20 17:33:45 crc kubenswrapper[4795]: I0320 17:33:45.953210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"b1dde30b0c701a10860f13a910a9e8f9803b8392320e34463cb0db966deabbf1"} Mar 20 17:33:45 crc kubenswrapper[4795]: I0320 17:33:45.953270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"03cc33862050c1fc5ae5652d429e1f638e56e81c92a19827794c18c508ff5b0a"} Mar 20 17:33:45 crc kubenswrapper[4795]: I0320 17:33:45.953289 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"109bbf73da794c2069c5d335485f4278e573bdf15539b1d75755092218da2526"} Mar 20 17:33:45 crc kubenswrapper[4795]: I0320 17:33:45.953306 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"f11a7537fed5e62acea990af82c3b5be65fb8cc112f07350709fa90157d520de"} Mar 20 17:33:46 crc kubenswrapper[4795]: I0320 17:33:46.972037 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"de36b2cf48ac7ad1dd4fa320be69cbdbeab02b55529f134cda0c2a3535115728"} Mar 20 17:33:46 crc kubenswrapper[4795]: I0320 17:33:46.972089 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-66lbd" event={"ID":"a748ee28-0a26-4700-b384-3afa65b8ac9d","Type":"ContainerStarted","Data":"8b13a57fdc77700dedba599b9a9a5d8568c31fb585e887369e3e6780299a60de"} Mar 20 17:33:46 crc kubenswrapper[4795]: I0320 17:33:46.972265 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:47 crc kubenswrapper[4795]: I0320 17:33:47.006176 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-66lbd" podStartSLOduration=6.221770913 podStartE2EDuration="13.006148246s" podCreationTimestamp="2026-03-20 17:33:34 +0000 UTC" firstStartedPulling="2026-03-20 17:33:35.503971387 +0000 UTC m=+958.962002928" lastFinishedPulling="2026-03-20 17:33:42.28834872 +0000 UTC m=+965.746380261" observedRunningTime="2026-03-20 17:33:47.00503104 +0000 UTC m=+970.463062611" watchObservedRunningTime="2026-03-20 17:33:47.006148246 +0000 UTC m=+970.464179797" Mar 20 17:33:50 crc kubenswrapper[4795]: I0320 17:33:50.349638 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:50 crc kubenswrapper[4795]: I0320 17:33:50.404875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:55 crc kubenswrapper[4795]: I0320 17:33:55.354654 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-66lbd" Mar 20 17:33:55 crc kubenswrapper[4795]: I0320 17:33:55.365546 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jw5dk" Mar 20 17:33:56 crc kubenswrapper[4795]: I0320 17:33:56.917900 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bl9qp" Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.753522 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.754395 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.756710 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-glwb4" Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.756875 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.758248 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.794322 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:33:59 crc kubenswrapper[4795]: I0320 17:33:59.932194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khnnq\" (UniqueName: \"kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq\") pod \"openstack-operator-index-bhhgs\" (UID: \"ab99d32a-4e50-468c-8eb2-1f12db5e9981\") " pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.033839 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khnnq\" (UniqueName: \"kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq\") pod \"openstack-operator-index-bhhgs\" (UID: \"ab99d32a-4e50-468c-8eb2-1f12db5e9981\") " pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.057422 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khnnq\" (UniqueName: \"kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq\") pod \"openstack-operator-index-bhhgs\" (UID: 
\"ab99d32a-4e50-468c-8eb2-1f12db5e9981\") " pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.099620 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.137180 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567134-glqvv"] Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.138161 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.139498 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.140239 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.140450 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.160557 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-glqvv"] Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.240398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqm69\" (UniqueName: \"kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69\") pod \"auto-csr-approver-29567134-glqvv\" (UID: \"1462264f-6c8a-4024-9465-3e7d2908ba24\") " pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.343021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqm69\" (UniqueName: 
\"kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69\") pod \"auto-csr-approver-29567134-glqvv\" (UID: \"1462264f-6c8a-4024-9465-3e7d2908ba24\") " pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.357223 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.359176 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqm69\" (UniqueName: \"kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69\") pod \"auto-csr-approver-29567134-glqvv\" (UID: \"1462264f-6c8a-4024-9465-3e7d2908ba24\") " pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.509286 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:00 crc kubenswrapper[4795]: I0320 17:34:00.716057 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-glqvv"] Mar 20 17:34:00 crc kubenswrapper[4795]: W0320 17:34:00.720279 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1462264f_6c8a_4024_9465_3e7d2908ba24.slice/crio-5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9 WatchSource:0}: Error finding container 5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9: Status 404 returned error can't find the container with id 5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9 Mar 20 17:34:01 crc kubenswrapper[4795]: I0320 17:34:01.063525 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bhhgs" 
event={"ID":"ab99d32a-4e50-468c-8eb2-1f12db5e9981","Type":"ContainerStarted","Data":"0da8b87bf441877de837f21b2b0a85c9b5aca4d6bb51944bfc828ea8a620dfe8"} Mar 20 17:34:01 crc kubenswrapper[4795]: I0320 17:34:01.064263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-glqvv" event={"ID":"1462264f-6c8a-4024-9465-3e7d2908ba24","Type":"ContainerStarted","Data":"5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9"} Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.124940 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.736704 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-b6ckg"] Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.737891 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.756792 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b6ckg"] Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.893749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhs4b\" (UniqueName: \"kubernetes.io/projected/3aeffd27-d2c7-4744-8e01-07a4db74597e-kube-api-access-mhs4b\") pod \"openstack-operator-index-b6ckg\" (UID: \"3aeffd27-d2c7-4744-8e01-07a4db74597e\") " pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:03 crc kubenswrapper[4795]: I0320 17:34:03.995291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhs4b\" (UniqueName: \"kubernetes.io/projected/3aeffd27-d2c7-4744-8e01-07a4db74597e-kube-api-access-mhs4b\") pod \"openstack-operator-index-b6ckg\" (UID: \"3aeffd27-d2c7-4744-8e01-07a4db74597e\") " 
pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.025184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhs4b\" (UniqueName: \"kubernetes.io/projected/3aeffd27-d2c7-4744-8e01-07a4db74597e-kube-api-access-mhs4b\") pod \"openstack-operator-index-b6ckg\" (UID: \"3aeffd27-d2c7-4744-8e01-07a4db74597e\") " pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.089659 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.100779 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bhhgs" event={"ID":"ab99d32a-4e50-468c-8eb2-1f12db5e9981","Type":"ContainerStarted","Data":"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8"} Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.100909 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-bhhgs" podUID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" containerName="registry-server" containerID="cri-o://15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8" gracePeriod=2 Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.107896 4795 generic.go:334] "Generic (PLEG): container finished" podID="1462264f-6c8a-4024-9465-3e7d2908ba24" containerID="22dcfbd2225d9c0ffa8966a0b94e82b8d86d62d5548dc394c4f180ba099a7edd" exitCode=0 Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.107971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-glqvv" event={"ID":"1462264f-6c8a-4024-9465-3e7d2908ba24","Type":"ContainerDied","Data":"22dcfbd2225d9c0ffa8966a0b94e82b8d86d62d5548dc394c4f180ba099a7edd"} Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 
17:34:04.146437 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-bhhgs" podStartSLOduration=2.40082847 podStartE2EDuration="5.146407372s" podCreationTimestamp="2026-03-20 17:33:59 +0000 UTC" firstStartedPulling="2026-03-20 17:34:00.368567392 +0000 UTC m=+983.826598933" lastFinishedPulling="2026-03-20 17:34:03.114146264 +0000 UTC m=+986.572177835" observedRunningTime="2026-03-20 17:34:04.124656085 +0000 UTC m=+987.582687686" watchObservedRunningTime="2026-03-20 17:34:04.146407372 +0000 UTC m=+987.604438953" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.326363 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b6ckg"] Mar 20 17:34:04 crc kubenswrapper[4795]: W0320 17:34:04.332963 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aeffd27_d2c7_4744_8e01_07a4db74597e.slice/crio-d5dbeca4bb915835841cb950f5de6cff7d2071530558731865ed98c519936a57 WatchSource:0}: Error finding container d5dbeca4bb915835841cb950f5de6cff7d2071530558731865ed98c519936a57: Status 404 returned error can't find the container with id d5dbeca4bb915835841cb950f5de6cff7d2071530558731865ed98c519936a57 Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.550339 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.704259 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khnnq\" (UniqueName: \"kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq\") pod \"ab99d32a-4e50-468c-8eb2-1f12db5e9981\" (UID: \"ab99d32a-4e50-468c-8eb2-1f12db5e9981\") " Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.711274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq" (OuterVolumeSpecName: "kube-api-access-khnnq") pod "ab99d32a-4e50-468c-8eb2-1f12db5e9981" (UID: "ab99d32a-4e50-468c-8eb2-1f12db5e9981"). InnerVolumeSpecName "kube-api-access-khnnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:04 crc kubenswrapper[4795]: I0320 17:34:04.805583 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khnnq\" (UniqueName: \"kubernetes.io/projected/ab99d32a-4e50-468c-8eb2-1f12db5e9981-kube-api-access-khnnq\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.119825 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b6ckg" event={"ID":"3aeffd27-d2c7-4744-8e01-07a4db74597e","Type":"ContainerStarted","Data":"fd5b4a3d9366dedff95cc9a6adbd89a7be3445a11ae3e3045855ac21d8023dbb"} Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.119906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b6ckg" event={"ID":"3aeffd27-d2c7-4744-8e01-07a4db74597e","Type":"ContainerStarted","Data":"d5dbeca4bb915835841cb950f5de6cff7d2071530558731865ed98c519936a57"} Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.122448 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" containerID="15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8" exitCode=0 Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.122524 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bhhgs" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.122586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bhhgs" event={"ID":"ab99d32a-4e50-468c-8eb2-1f12db5e9981","Type":"ContainerDied","Data":"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8"} Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.122638 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bhhgs" event={"ID":"ab99d32a-4e50-468c-8eb2-1f12db5e9981","Type":"ContainerDied","Data":"0da8b87bf441877de837f21b2b0a85c9b5aca4d6bb51944bfc828ea8a620dfe8"} Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.122670 4795 scope.go:117] "RemoveContainer" containerID="15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.145198 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-b6ckg" podStartSLOduration=2.071242656 podStartE2EDuration="2.145169861s" podCreationTimestamp="2026-03-20 17:34:03 +0000 UTC" firstStartedPulling="2026-03-20 17:34:04.337315131 +0000 UTC m=+987.795346672" lastFinishedPulling="2026-03-20 17:34:04.411242336 +0000 UTC m=+987.869273877" observedRunningTime="2026-03-20 17:34:05.141792975 +0000 UTC m=+988.599824617" watchObservedRunningTime="2026-03-20 17:34:05.145169861 +0000 UTC m=+988.603201442" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.162624 4795 scope.go:117] "RemoveContainer" containerID="15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8" Mar 20 17:34:05 crc 
kubenswrapper[4795]: E0320 17:34:05.165165 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8\": container with ID starting with 15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8 not found: ID does not exist" containerID="15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.165242 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8"} err="failed to get container status \"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8\": rpc error: code = NotFound desc = could not find container \"15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8\": container with ID starting with 15cf260e0ddf7853da383966deccebf03c6cd74911aeff237baac2dd67c478e8 not found: ID does not exist" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.179387 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.187257 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-bhhgs"] Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.264490 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" path="/var/lib/kubelet/pods/ab99d32a-4e50-468c-8eb2-1f12db5e9981/volumes" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.438742 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.515509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqm69\" (UniqueName: \"kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69\") pod \"1462264f-6c8a-4024-9465-3e7d2908ba24\" (UID: \"1462264f-6c8a-4024-9465-3e7d2908ba24\") " Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.527916 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69" (OuterVolumeSpecName: "kube-api-access-lqm69") pod "1462264f-6c8a-4024-9465-3e7d2908ba24" (UID: "1462264f-6c8a-4024-9465-3e7d2908ba24"). InnerVolumeSpecName "kube-api-access-lqm69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:05 crc kubenswrapper[4795]: I0320 17:34:05.617786 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqm69\" (UniqueName: \"kubernetes.io/projected/1462264f-6c8a-4024-9465-3e7d2908ba24-kube-api-access-lqm69\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:06 crc kubenswrapper[4795]: I0320 17:34:06.133324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-glqvv" event={"ID":"1462264f-6c8a-4024-9465-3e7d2908ba24","Type":"ContainerDied","Data":"5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9"} Mar 20 17:34:06 crc kubenswrapper[4795]: I0320 17:34:06.133387 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fbdfd5d4c3e97c941aefdc8fd2ada635ddca18417b84691a6a7d39b02bbb0a9" Mar 20 17:34:06 crc kubenswrapper[4795]: I0320 17:34:06.133345 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-glqvv" Mar 20 17:34:06 crc kubenswrapper[4795]: I0320 17:34:06.504250 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-bqp8h"] Mar 20 17:34:06 crc kubenswrapper[4795]: I0320 17:34:06.513050 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-bqp8h"] Mar 20 17:34:07 crc kubenswrapper[4795]: I0320 17:34:07.265521 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be9f091-42a0-432b-8f14-700bc3e733cb" path="/var/lib/kubelet/pods/4be9f091-42a0-432b-8f14-700bc3e733cb/volumes" Mar 20 17:34:11 crc kubenswrapper[4795]: I0320 17:34:11.299747 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:34:11 crc kubenswrapper[4795]: I0320 17:34:11.300177 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:34:14 crc kubenswrapper[4795]: I0320 17:34:14.090563 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:14 crc kubenswrapper[4795]: I0320 17:34:14.090643 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:14 crc kubenswrapper[4795]: I0320 17:34:14.136932 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:14 crc kubenswrapper[4795]: I0320 17:34:14.242965 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-b6ckg" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.789254 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"] Mar 20 17:34:15 crc kubenswrapper[4795]: E0320 17:34:15.789854 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" containerName="registry-server" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.789870 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" containerName="registry-server" Mar 20 17:34:15 crc kubenswrapper[4795]: E0320 17:34:15.789882 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1462264f-6c8a-4024-9465-3e7d2908ba24" containerName="oc" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.789890 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1462264f-6c8a-4024-9465-3e7d2908ba24" containerName="oc" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.790023 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab99d32a-4e50-468c-8eb2-1f12db5e9981" containerName="registry-server" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.790045 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1462264f-6c8a-4024-9465-3e7d2908ba24" containerName="oc" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.791011 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.793373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lvhj5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.803186 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"] Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.862104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.862214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.862306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhb8\" (UniqueName: \"kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 
17:34:15.963623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhb8\" (UniqueName: \"kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"
Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.963795 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"
Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.963899 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"
Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.964552 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"
Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.964552 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"
Mar 20 17:34:15 crc kubenswrapper[4795]: I0320 17:34:15.995768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhb8\" (UniqueName: \"kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"
Mar 20 17:34:16 crc kubenswrapper[4795]: I0320 17:34:16.110793 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"
Mar 20 17:34:16 crc kubenswrapper[4795]: I0320 17:34:16.616259 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"]
Mar 20 17:34:16 crc kubenswrapper[4795]: W0320 17:34:16.628009 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7571109_7ce9_44a8_9275_4af4fadbd0e6.slice/crio-346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321 WatchSource:0}: Error finding container 346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321: Status 404 returned error can't find the container with id 346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321
Mar 20 17:34:17 crc kubenswrapper[4795]: I0320 17:34:17.221548 4795 generic.go:334] "Generic (PLEG): container finished" podID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerID="2d67a9a380a082071bfa78d21716eb4b1e97fe88d534c7e295d5fcb7891dfacc" exitCode=0
Mar 20 17:34:17 crc kubenswrapper[4795]: I0320 17:34:17.221714 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" event={"ID":"f7571109-7ce9-44a8-9275-4af4fadbd0e6","Type":"ContainerDied","Data":"2d67a9a380a082071bfa78d21716eb4b1e97fe88d534c7e295d5fcb7891dfacc"}
Mar 20 17:34:17 crc kubenswrapper[4795]: I0320 17:34:17.221928 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" event={"ID":"f7571109-7ce9-44a8-9275-4af4fadbd0e6","Type":"ContainerStarted","Data":"346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321"}
Mar 20 17:34:18 crc kubenswrapper[4795]: I0320 17:34:18.235196 4795 generic.go:334] "Generic (PLEG): container finished" podID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerID="90bee4baf4c24459457ba73d1c0b53d701cf2273d32aa38f67a9d73df0665a44" exitCode=0
Mar 20 17:34:18 crc kubenswrapper[4795]: I0320 17:34:18.235276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" event={"ID":"f7571109-7ce9-44a8-9275-4af4fadbd0e6","Type":"ContainerDied","Data":"90bee4baf4c24459457ba73d1c0b53d701cf2273d32aa38f67a9d73df0665a44"}
Mar 20 17:34:19 crc kubenswrapper[4795]: I0320 17:34:19.244025 4795 generic.go:334] "Generic (PLEG): container finished" podID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerID="5e0c33eb1ff9dd81990fd915b2c9bc30c9229c45998fe36c98c1b059681a7211" exitCode=0
Mar 20 17:34:19 crc kubenswrapper[4795]: I0320 17:34:19.244088 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" event={"ID":"f7571109-7ce9-44a8-9275-4af4fadbd0e6","Type":"ContainerDied","Data":"5e0c33eb1ff9dd81990fd915b2c9bc30c9229c45998fe36c98c1b059681a7211"}
Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.536031 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"
Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.628579 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwhb8\" (UniqueName: \"kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8\") pod \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") "
Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.628763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle\") pod \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") "
Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.628828 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util\") pod \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\" (UID: \"f7571109-7ce9-44a8-9275-4af4fadbd0e6\") "
Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.629516 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle" (OuterVolumeSpecName: "bundle") pod "f7571109-7ce9-44a8-9275-4af4fadbd0e6" (UID: "f7571109-7ce9-44a8-9275-4af4fadbd0e6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.634807 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8" (OuterVolumeSpecName: "kube-api-access-dwhb8") pod "f7571109-7ce9-44a8-9275-4af4fadbd0e6" (UID: "f7571109-7ce9-44a8-9275-4af4fadbd0e6"). InnerVolumeSpecName "kube-api-access-dwhb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.648567 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util" (OuterVolumeSpecName: "util") pod "f7571109-7ce9-44a8-9275-4af4fadbd0e6" (UID: "f7571109-7ce9-44a8-9275-4af4fadbd0e6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.729909 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.729962 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7571109-7ce9-44a8-9275-4af4fadbd0e6-util\") on node \"crc\" DevicePath \"\""
Mar 20 17:34:20 crc kubenswrapper[4795]: I0320 17:34:20.729974 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwhb8\" (UniqueName: \"kubernetes.io/projected/f7571109-7ce9-44a8-9275-4af4fadbd0e6-kube-api-access-dwhb8\") on node \"crc\" DevicePath \"\""
Mar 20 17:34:21 crc kubenswrapper[4795]: I0320 17:34:21.263034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5" event={"ID":"f7571109-7ce9-44a8-9275-4af4fadbd0e6","Type":"ContainerDied","Data":"346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321"}
Mar 20 17:34:21 crc kubenswrapper[4795]: I0320 17:34:21.263101 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="346c325c2182b7ad4c0d3e6dfececfa2aa1d6b8414272b1454e3ed65513f6321"
Mar 20 17:34:21 crc kubenswrapper[4795]: I0320 17:34:21.263164 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.074385 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"]
Mar 20 17:34:23 crc kubenswrapper[4795]: E0320 17:34:23.074918 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="pull"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.074936 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="pull"
Mar 20 17:34:23 crc kubenswrapper[4795]: E0320 17:34:23.074958 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="util"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.074967 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="util"
Mar 20 17:34:23 crc kubenswrapper[4795]: E0320 17:34:23.074985 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="extract"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.074993 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="extract"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.075117 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7571109-7ce9-44a8-9275-4af4fadbd0e6" containerName="extract"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.075595 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.077494 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rhg5v"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.119074 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"]
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.162604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2694\" (UniqueName: \"kubernetes.io/projected/084071f5-e58b-451b-9cf5-67203ae1ba02-kube-api-access-c2694\") pod \"openstack-operator-controller-init-65b67cc5c9-vm29j\" (UID: \"084071f5-e58b-451b-9cf5-67203ae1ba02\") " pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.263604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2694\" (UniqueName: \"kubernetes.io/projected/084071f5-e58b-451b-9cf5-67203ae1ba02-kube-api-access-c2694\") pod \"openstack-operator-controller-init-65b67cc5c9-vm29j\" (UID: \"084071f5-e58b-451b-9cf5-67203ae1ba02\") " pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.284574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2694\" (UniqueName: \"kubernetes.io/projected/084071f5-e58b-451b-9cf5-67203ae1ba02-kube-api-access-c2694\") pod \"openstack-operator-controller-init-65b67cc5c9-vm29j\" (UID: \"084071f5-e58b-451b-9cf5-67203ae1ba02\") " pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.442236 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"
Mar 20 17:34:23 crc kubenswrapper[4795]: I0320 17:34:23.906349 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"]
Mar 20 17:34:23 crc kubenswrapper[4795]: W0320 17:34:23.914987 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod084071f5_e58b_451b_9cf5_67203ae1ba02.slice/crio-b8283927e967447ba4ad8521e5dd65fe00d031e5039dee36b7199bf437437741 WatchSource:0}: Error finding container b8283927e967447ba4ad8521e5dd65fe00d031e5039dee36b7199bf437437741: Status 404 returned error can't find the container with id b8283927e967447ba4ad8521e5dd65fe00d031e5039dee36b7199bf437437741
Mar 20 17:34:24 crc kubenswrapper[4795]: I0320 17:34:24.280678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" event={"ID":"084071f5-e58b-451b-9cf5-67203ae1ba02","Type":"ContainerStarted","Data":"b8283927e967447ba4ad8521e5dd65fe00d031e5039dee36b7199bf437437741"}
Mar 20 17:34:29 crc kubenswrapper[4795]: I0320 17:34:29.325005 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" event={"ID":"084071f5-e58b-451b-9cf5-67203ae1ba02","Type":"ContainerStarted","Data":"8640157f9b5a55f800571b9991838d07d033a444ab6bc55fe715cab41c344de7"}
Mar 20 17:34:29 crc kubenswrapper[4795]: I0320 17:34:29.325496 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"
Mar 20 17:34:29 crc kubenswrapper[4795]: I0320 17:34:29.368055 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j" podStartSLOduration=2.050125497 podStartE2EDuration="6.368036782s" podCreationTimestamp="2026-03-20 17:34:23 +0000 UTC" firstStartedPulling="2026-03-20 17:34:23.917832671 +0000 UTC m=+1007.375864202" lastFinishedPulling="2026-03-20 17:34:28.235743946 +0000 UTC m=+1011.693775487" observedRunningTime="2026-03-20 17:34:29.365363778 +0000 UTC m=+1012.823395319" watchObservedRunningTime="2026-03-20 17:34:29.368036782 +0000 UTC m=+1012.826068333"
Mar 20 17:34:33 crc kubenswrapper[4795]: I0320 17:34:33.451243 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-vm29j"
Mar 20 17:34:41 crc kubenswrapper[4795]: I0320 17:34:41.300030 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:34:41 crc kubenswrapper[4795]: I0320 17:34:41.300753 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:34:49 crc kubenswrapper[4795]: I0320 17:34:49.534085 4795 scope.go:117] "RemoveContainer" containerID="326b8c75bc495d3f796856aa4f0f247f31974ad88ddb26ad9ca2ca9ec8cf372a"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.508581 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.510177 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.524835 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lljpv"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.547341 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.549009 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g56lz\" (UniqueName: \"kubernetes.io/projected/afefdb79-bad6-4deb-904b-515174cca414-kube-api-access-g56lz\") pod \"barbican-operator-controller-manager-59bc569d95-5hzvs\" (UID: \"afefdb79-bad6-4deb-904b-515174cca414\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.558271 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.560217 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.568186 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4bq59"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.571633 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.581899 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.584248 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-q58d8"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.599098 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.605413 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.618741 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.620898 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.622567 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-84jqr"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.627899 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.629494 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.632396 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-kqswl"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.633718 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.650371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwmwj\" (UniqueName: \"kubernetes.io/projected/21481bba-04ec-47ce-95d0-fe27787a3d62-kube-api-access-zwmwj\") pod \"cinder-operator-controller-manager-8d58dc466-h9f9t\" (UID: \"21481bba-04ec-47ce-95d0-fe27787a3d62\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.650431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4dhn\" (UniqueName: \"kubernetes.io/projected/43804d6b-2358-46fd-bf04-26b2308f8ab0-kube-api-access-r4dhn\") pod \"designate-operator-controller-manager-588d4d986b-jgs27\" (UID: \"43804d6b-2358-46fd-bf04-26b2308f8ab0\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.650478 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t59xc\" (UniqueName: \"kubernetes.io/projected/a957ef3d-357c-4aa4-865c-533f889257d7-kube-api-access-t59xc\") pod \"glance-operator-controller-manager-79df6bcc97-dwx6n\" (UID: \"a957ef3d-357c-4aa4-865c-533f889257d7\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.650515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g56lz\" (UniqueName: \"kubernetes.io/projected/afefdb79-bad6-4deb-904b-515174cca414-kube-api-access-g56lz\") pod \"barbican-operator-controller-manager-59bc569d95-5hzvs\" (UID: \"afefdb79-bad6-4deb-904b-515174cca414\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.650544 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mlgc\" (UniqueName: \"kubernetes.io/projected/4cdd16c5-b7d3-4c52-a286-f3555daf43d9-kube-api-access-4mlgc\") pod \"heat-operator-controller-manager-67dd5f86f5-rmcrf\" (UID: \"4cdd16c5-b7d3-4c52-a286-f3555daf43d9\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.651204 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.655373 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.656856 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.663258 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-bfff2"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.667795 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.674313 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.675455 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.678230 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wc277"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.679148 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.679992 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.687801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g56lz\" (UniqueName: \"kubernetes.io/projected/afefdb79-bad6-4deb-904b-515174cca414-kube-api-access-g56lz\") pod \"barbican-operator-controller-manager-59bc569d95-5hzvs\" (UID: \"afefdb79-bad6-4deb-904b-515174cca414\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.693067 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.695675 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.698629 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.703063 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-6b7c8"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.718775 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.719557 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.724663 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qf8fp"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.727753 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-trjt4"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.729029 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.738220 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.738364 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-rzr9l"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751135 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmklc\" (UniqueName: \"kubernetes.io/projected/84901a7b-ddbf-47d9-954f-c167cd9cd46c-kube-api-access-fmklc\") pod \"keystone-operator-controller-manager-768b96df4c-6hsxn\" (UID: \"84901a7b-ddbf-47d9-954f-c167cd9cd46c\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751181 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvmkp\" (UniqueName: \"kubernetes.io/projected/9cba9cd3-4144-4262-82a2-f2330793aae6-kube-api-access-nvmkp\") pod \"ironic-operator-controller-manager-6f787dddc9-55vp5\" (UID: \"9cba9cd3-4144-4262-82a2-f2330793aae6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751202 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751407 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mlgc\" (UniqueName: \"kubernetes.io/projected/4cdd16c5-b7d3-4c52-a286-f3555daf43d9-kube-api-access-4mlgc\") pod \"heat-operator-controller-manager-67dd5f86f5-rmcrf\" (UID: \"4cdd16c5-b7d3-4c52-a286-f3555daf43d9\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751536 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwmwj\" (UniqueName: \"kubernetes.io/projected/21481bba-04ec-47ce-95d0-fe27787a3d62-kube-api-access-zwmwj\") pod \"cinder-operator-controller-manager-8d58dc466-h9f9t\" (UID: \"21481bba-04ec-47ce-95d0-fe27787a3d62\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpfzh\" (UniqueName: \"kubernetes.io/projected/7a887d91-fa86-45d2-a6be-aa7326f7d544-kube-api-access-mpfzh\") pod \"manila-operator-controller-manager-55f864c847-trjt4\" (UID: \"7a887d91-fa86-45d2-a6be-aa7326f7d544\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751705 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4dhn\" (UniqueName: \"kubernetes.io/projected/43804d6b-2358-46fd-bf04-26b2308f8ab0-kube-api-access-r4dhn\") pod \"designate-operator-controller-manager-588d4d986b-jgs27\" (UID: \"43804d6b-2358-46fd-bf04-26b2308f8ab0\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vfms\" (UniqueName: \"kubernetes.io/projected/ded84ba8-d70a-4379-bc80-d142e5306cc7-kube-api-access-8vfms\") pod \"horizon-operator-controller-manager-8464cc45fb-f74p9\" (UID: \"ded84ba8-d70a-4379-bc80-d142e5306cc7\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdccc\" (UniqueName: \"kubernetes.io/projected/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-kube-api-access-fdccc\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.751909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t59xc\" (UniqueName: \"kubernetes.io/projected/a957ef3d-357c-4aa4-865c-533f889257d7-kube-api-access-t59xc\") pod \"glance-operator-controller-manager-79df6bcc97-dwx6n\" (UID: \"a957ef3d-357c-4aa4-865c-533f889257d7\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.758442 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-trjt4"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.763808 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.764770 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.770066 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-z48z9"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.775667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwmwj\" (UniqueName: \"kubernetes.io/projected/21481bba-04ec-47ce-95d0-fe27787a3d62-kube-api-access-zwmwj\") pod \"cinder-operator-controller-manager-8d58dc466-h9f9t\" (UID: \"21481bba-04ec-47ce-95d0-fe27787a3d62\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.784262 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4dhn\" (UniqueName: \"kubernetes.io/projected/43804d6b-2358-46fd-bf04-26b2308f8ab0-kube-api-access-r4dhn\") pod \"designate-operator-controller-manager-588d4d986b-jgs27\" (UID: \"43804d6b-2358-46fd-bf04-26b2308f8ab0\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.787902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mlgc\" (UniqueName: \"kubernetes.io/projected/4cdd16c5-b7d3-4c52-a286-f3555daf43d9-kube-api-access-4mlgc\") pod \"heat-operator-controller-manager-67dd5f86f5-rmcrf\" (UID: \"4cdd16c5-b7d3-4c52-a286-f3555daf43d9\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.798393 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.800462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t59xc\" (UniqueName: \"kubernetes.io/projected/a957ef3d-357c-4aa4-865c-533f889257d7-kube-api-access-t59xc\") pod \"glance-operator-controller-manager-79df6bcc97-dwx6n\" (UID: \"a957ef3d-357c-4aa4-865c-533f889257d7\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.836165 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.837017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.842099 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7zpb9"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.853738 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg"]
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpfzh\" (UniqueName: \"kubernetes.io/projected/7a887d91-fa86-45d2-a6be-aa7326f7d544-kube-api-access-mpfzh\") pod \"manila-operator-controller-manager-55f864c847-trjt4\" (UID: \"7a887d91-fa86-45d2-a6be-aa7326f7d544\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854463 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gddpx\" (UniqueName: \"kubernetes.io/projected/071f0af8-4164-4f95-b0ee-720e3b3097f3-kube-api-access-gddpx\") pod \"mariadb-operator-controller-manager-67ccfc9778-jfdzb\" (UID: \"071f0af8-4164-4f95-b0ee-720e3b3097f3\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854498 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vfms\" (UniqueName: \"kubernetes.io/projected/ded84ba8-d70a-4379-bc80-d142e5306cc7-kube-api-access-8vfms\") pod \"horizon-operator-controller-manager-8464cc45fb-f74p9\" (UID: \"ded84ba8-d70a-4379-bc80-d142e5306cc7\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdccc\" (UniqueName: \"kubernetes.io/projected/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-kube-api-access-fdccc\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854529 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmklc\" (UniqueName: \"kubernetes.io/projected/84901a7b-ddbf-47d9-954f-c167cd9cd46c-kube-api-access-fmklc\") pod \"keystone-operator-controller-manager-768b96df4c-6hsxn\" (UID: \"84901a7b-ddbf-47d9-954f-c167cd9cd46c\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854568 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvmkp\" (UniqueName: \"kubernetes.io/projected/9cba9cd3-4144-4262-82a2-f2330793aae6-kube-api-access-nvmkp\") pod \"ironic-operator-controller-manager-6f787dddc9-55vp5\" (UID: \"9cba9cd3-4144-4262-82a2-f2330793aae6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5"
Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.854586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"
Mar 20 17:34:51 crc kubenswrapper[4795]: E0320 17:34:51.854725 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 17:34:51 crc kubenswrapper[4795]: E0320 17:34:51.854770 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed.
No retries permitted until 2026-03-20 17:34:52.354755457 +0000 UTC m=+1035.812786998 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.865493 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.875294 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.879212 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.880469 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.881789 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.887257 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-glwpx" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.887731 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-d8klm" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.892779 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.896806 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vfms\" (UniqueName: \"kubernetes.io/projected/ded84ba8-d70a-4379-bc80-d142e5306cc7-kube-api-access-8vfms\") pod \"horizon-operator-controller-manager-8464cc45fb-f74p9\" (UID: \"ded84ba8-d70a-4379-bc80-d142e5306cc7\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.897081 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.898140 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvmkp\" (UniqueName: \"kubernetes.io/projected/9cba9cd3-4144-4262-82a2-f2330793aae6-kube-api-access-nvmkp\") pod \"ironic-operator-controller-manager-6f787dddc9-55vp5\" (UID: \"9cba9cd3-4144-4262-82a2-f2330793aae6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.904655 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.905448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdccc\" (UniqueName: \"kubernetes.io/projected/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-kube-api-access-fdccc\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.908141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpfzh\" (UniqueName: \"kubernetes.io/projected/7a887d91-fa86-45d2-a6be-aa7326f7d544-kube-api-access-mpfzh\") pod \"manila-operator-controller-manager-55f864c847-trjt4\" (UID: \"7a887d91-fa86-45d2-a6be-aa7326f7d544\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.912667 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.913255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fmklc\" (UniqueName: \"kubernetes.io/projected/84901a7b-ddbf-47d9-954f-c167cd9cd46c-kube-api-access-fmklc\") pod \"keystone-operator-controller-manager-768b96df4c-6hsxn\" (UID: \"84901a7b-ddbf-47d9-954f-c167cd9cd46c\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.913906 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.924413 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-tshl6" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.934572 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.935427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.938472 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-j696n" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.938669 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.989178 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.990503 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz"] Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.992316 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" Mar 20 17:34:51 crc kubenswrapper[4795]: I0320 17:34:51.992897 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996247 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk6wb\" (UniqueName: \"kubernetes.io/projected/d4ff6977-1303-4267-983e-3e99935f2aae-kube-api-access-lk6wb\") pod \"octavia-operator-controller-manager-5b9f45d989-n7cl7\" (UID: \"d4ff6977-1303-4267-983e-3e99935f2aae\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pdhx\" (UniqueName: \"kubernetes.io/projected/0ffe016b-8919-4b8f-839c-669637b7accc-kube-api-access-9pdhx\") pod \"neutron-operator-controller-manager-767865f676-bqzcz\" (UID: \"0ffe016b-8919-4b8f-839c-669637b7accc\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gddpx\" (UniqueName: \"kubernetes.io/projected/071f0af8-4164-4f95-b0ee-720e3b3097f3-kube-api-access-gddpx\") pod 
\"mariadb-operator-controller-manager-67ccfc9778-jfdzb\" (UID: \"071f0af8-4164-4f95-b0ee-720e3b3097f3\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spfbz\" (UniqueName: \"kubernetes.io/projected/84a19583-b173-4fb9-8b83-d9c41a5faf79-kube-api-access-spfbz\") pod \"ovn-operator-controller-manager-884679f54-dtfmz\" (UID: \"84a19583-b173-4fb9-8b83-d9c41a5faf79\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qttmq\" (UniqueName: \"kubernetes.io/projected/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-kube-api-access-qttmq\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:51.996460 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct28n\" (UniqueName: \"kubernetes.io/projected/0da03e08-561c-4b5f-89c7-af80c8f39f54-kube-api-access-ct28n\") pod \"nova-operator-controller-manager-5d488d59fb-5v5sg\" (UID: 
\"0da03e08-561c-4b5f-89c7-af80c8f39f54\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.002710 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.005796 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.009540 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-cn87b" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.025475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gddpx\" (UniqueName: \"kubernetes.io/projected/071f0af8-4164-4f95-b0ee-720e3b3097f3-kube-api-access-gddpx\") pod \"mariadb-operator-controller-manager-67ccfc9778-jfdzb\" (UID: \"071f0af8-4164-4f95-b0ee-720e3b3097f3\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.045749 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.051387 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.053891 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.073997 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.078841 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.086331 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-828jr"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.087394 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.099797 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.099858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spfbz\" (UniqueName: \"kubernetes.io/projected/84a19583-b173-4fb9-8b83-d9c41a5faf79-kube-api-access-spfbz\") pod \"ovn-operator-controller-manager-884679f54-dtfmz\" (UID: \"84a19583-b173-4fb9-8b83-d9c41a5faf79\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.099916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qttmq\" (UniqueName: \"kubernetes.io/projected/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-kube-api-access-qttmq\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: 
\"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.099942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct28n\" (UniqueName: \"kubernetes.io/projected/0da03e08-561c-4b5f-89c7-af80c8f39f54-kube-api-access-ct28n\") pod \"nova-operator-controller-manager-5d488d59fb-5v5sg\" (UID: \"0da03e08-561c-4b5f-89c7-af80c8f39f54\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.099978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk6wb\" (UniqueName: \"kubernetes.io/projected/d4ff6977-1303-4267-983e-3e99935f2aae-kube-api-access-lk6wb\") pod \"octavia-operator-controller-manager-5b9f45d989-n7cl7\" (UID: \"d4ff6977-1303-4267-983e-3e99935f2aae\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.100017 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pdhx\" (UniqueName: \"kubernetes.io/projected/0ffe016b-8919-4b8f-839c-669637b7accc-kube-api-access-9pdhx\") pod \"neutron-operator-controller-manager-767865f676-bqzcz\" (UID: \"0ffe016b-8919-4b8f-839c-669637b7accc\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.100052 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mckr\" (UniqueName: \"kubernetes.io/projected/b47e6216-2e29-4d58-8b0c-5970aee6307b-kube-api-access-8mckr\") pod \"placement-operator-controller-manager-5784578c99-6cw7v\" (UID: \"b47e6216-2e29-4d58-8b0c-5970aee6307b\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 
20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.100092 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.100173 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:52.600144395 +0000 UTC m=+1036.058175936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.101509 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-fphs6" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.113961 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-828jr"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.132916 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct28n\" (UniqueName: \"kubernetes.io/projected/0da03e08-561c-4b5f-89c7-af80c8f39f54-kube-api-access-ct28n\") pod \"nova-operator-controller-manager-5d488d59fb-5v5sg\" (UID: \"0da03e08-561c-4b5f-89c7-af80c8f39f54\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.133327 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spfbz\" (UniqueName: 
\"kubernetes.io/projected/84a19583-b173-4fb9-8b83-d9c41a5faf79-kube-api-access-spfbz\") pod \"ovn-operator-controller-manager-884679f54-dtfmz\" (UID: \"84a19583-b173-4fb9-8b83-d9c41a5faf79\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.135785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pdhx\" (UniqueName: \"kubernetes.io/projected/0ffe016b-8919-4b8f-839c-669637b7accc-kube-api-access-9pdhx\") pod \"neutron-operator-controller-manager-767865f676-bqzcz\" (UID: \"0ffe016b-8919-4b8f-839c-669637b7accc\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.136350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk6wb\" (UniqueName: \"kubernetes.io/projected/d4ff6977-1303-4267-983e-3e99935f2aae-kube-api-access-lk6wb\") pod \"octavia-operator-controller-manager-5b9f45d989-n7cl7\" (UID: \"d4ff6977-1303-4267-983e-3e99935f2aae\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.140943 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.141554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qttmq\" (UniqueName: \"kubernetes.io/projected/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-kube-api-access-qttmq\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.154898 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.156019 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.160252 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7779x" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.177244 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.178348 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.187438 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.188969 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.191489 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vvgjj" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.203828 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.208322 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rbjs\" (UniqueName: \"kubernetes.io/projected/750d9405-0514-4876-821e-9ab1f6871e87-kube-api-access-4rbjs\") pod \"swift-operator-controller-manager-c674c5965-828jr\" (UID: \"750d9405-0514-4876-821e-9ab1f6871e87\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.208455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mckr\" (UniqueName: \"kubernetes.io/projected/b47e6216-2e29-4d58-8b0c-5970aee6307b-kube-api-access-8mckr\") pod \"placement-operator-controller-manager-5784578c99-6cw7v\" (UID: \"b47e6216-2e29-4d58-8b0c-5970aee6307b\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.210415 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.218051 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.218913 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.226295 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.227270 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.230625 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lvr2r" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.237333 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mckr\" (UniqueName: \"kubernetes.io/projected/b47e6216-2e29-4d58-8b0c-5970aee6307b-kube-api-access-8mckr\") pod \"placement-operator-controller-manager-5784578c99-6cw7v\" (UID: \"b47e6216-2e29-4d58-8b0c-5970aee6307b\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.284098 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.284947 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.285942 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.287966 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.288079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-h5tth" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.288509 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.290478 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.309401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4549\" (UniqueName: \"kubernetes.io/projected/46248665-6f9f-46e0-8db7-6be8c47cf521-kube-api-access-q4549\") pod \"telemetry-operator-controller-manager-d6b694c5-jbwss\" (UID: \"46248665-6f9f-46e0-8db7-6be8c47cf521\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.309448 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp47s\" (UniqueName: \"kubernetes.io/projected/933bcfd5-f2d1-404f-876d-1d3da597f415-kube-api-access-qp47s\") pod \"watcher-operator-controller-manager-6c4d75f7f9-6z7j5\" (UID: \"933bcfd5-f2d1-404f-876d-1d3da597f415\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.309529 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tzqzw\" (UniqueName: \"kubernetes.io/projected/e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38-kube-api-access-tzqzw\") pod \"test-operator-controller-manager-5c5cb9c4d7-rv5df\" (UID: \"e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.309582 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rbjs\" (UniqueName: \"kubernetes.io/projected/750d9405-0514-4876-821e-9ab1f6871e87-kube-api-access-4rbjs\") pod \"swift-operator-controller-manager-c674c5965-828jr\" (UID: \"750d9405-0514-4876-821e-9ab1f6871e87\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.327181 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rbjs\" (UniqueName: \"kubernetes.io/projected/750d9405-0514-4876-821e-9ab1f6871e87-kube-api-access-4rbjs\") pod \"swift-operator-controller-manager-c674c5965-828jr\" (UID: \"750d9405-0514-4876-821e-9ab1f6871e87\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.350550 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4549\" (UniqueName: \"kubernetes.io/projected/46248665-6f9f-46e0-8db7-6be8c47cf521-kube-api-access-q4549\") pod \"telemetry-operator-controller-manager-d6b694c5-jbwss\" (UID: \"46248665-6f9f-46e0-8db7-6be8c47cf521\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410627 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp47s\" (UniqueName: \"kubernetes.io/projected/933bcfd5-f2d1-404f-876d-1d3da597f415-kube-api-access-qp47s\") pod \"watcher-operator-controller-manager-6c4d75f7f9-6z7j5\" (UID: \"933bcfd5-f2d1-404f-876d-1d3da597f415\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410682 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzqzw\" (UniqueName: \"kubernetes.io/projected/e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38-kube-api-access-tzqzw\") pod \"test-operator-controller-manager-5c5cb9c4d7-rv5df\" (UID: \"e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410758 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grvzm\" (UniqueName: \"kubernetes.io/projected/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-kube-api-access-grvzm\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410792 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.410859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.411344 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.411391 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:53.411376244 +0000 UTC m=+1036.869407785 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.437418 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.442100 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzqzw\" (UniqueName: \"kubernetes.io/projected/e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38-kube-api-access-tzqzw\") pod \"test-operator-controller-manager-5c5cb9c4d7-rv5df\" (UID: \"e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.452622 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp47s\" (UniqueName: \"kubernetes.io/projected/933bcfd5-f2d1-404f-876d-1d3da597f415-kube-api-access-qp47s\") pod \"watcher-operator-controller-manager-6c4d75f7f9-6z7j5\" (UID: \"933bcfd5-f2d1-404f-876d-1d3da597f415\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.452635 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4549\" (UniqueName: \"kubernetes.io/projected/46248665-6f9f-46e0-8db7-6be8c47cf521-kube-api-access-q4549\") pod \"telemetry-operator-controller-manager-d6b694c5-jbwss\" (UID: \"46248665-6f9f-46e0-8db7-6be8c47cf521\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.466399 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.504150 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.513386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grvzm\" (UniqueName: \"kubernetes.io/projected/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-kube-api-access-grvzm\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.513458 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.513518 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.513708 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.513764 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:34:53.013745816 +0000 UTC m=+1036.471777357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.514167 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.514197 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:34:53.01418772 +0000 UTC m=+1036.472219261 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.534880 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.539547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.544641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grvzm\" (UniqueName: \"kubernetes.io/projected/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-kube-api-access-grvzm\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.557011 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.614425 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.615042 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: E0320 17:34:52.615122 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:34:53.615103567 +0000 UTC m=+1037.073135108 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.666909 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.674272 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.682431 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.847976 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf"] Mar 20 17:34:52 crc kubenswrapper[4795]: W0320 17:34:52.857053 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cdd16c5_b7d3_4c52_a286_f3555daf43d9.slice/crio-2fcb6e288ea33996299490589e0e5ef05a767952fd2ed908f11f572717d8004b WatchSource:0}: Error finding container 2fcb6e288ea33996299490589e0e5ef05a767952fd2ed908f11f572717d8004b: Status 404 returned error can't find the container with id 2fcb6e288ea33996299490589e0e5ef05a767952fd2ed908f11f572717d8004b Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.888228 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 
17:34:52.899546 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn"] Mar 20 17:34:52 crc kubenswrapper[4795]: I0320 17:34:52.905988 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz"] Mar 20 17:34:52 crc kubenswrapper[4795]: W0320 17:34:52.914360 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ffe016b_8919_4b8f_839c_669637b7accc.slice/crio-36d91a794c1f24f381f6acd78c49047ff9b43348de16ced056638998ec2f4d82 WatchSource:0}: Error finding container 36d91a794c1f24f381f6acd78c49047ff9b43348de16ced056638998ec2f4d82: Status 404 returned error can't find the container with id 36d91a794c1f24f381f6acd78c49047ff9b43348de16ced056638998ec2f4d82 Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.034482 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.034564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.034699 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:34:53 crc 
kubenswrapper[4795]: E0320 17:34:53.034783 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.034786 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:34:54.034767569 +0000 UTC m=+1037.492799100 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.034857 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:34:54.034836842 +0000 UTC m=+1037.492868493 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.059360 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7"] Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.068368 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb"] Mar 20 17:34:53 crc kubenswrapper[4795]: W0320 17:34:53.075727 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4ff6977_1303_4267_983e_3e99935f2aae.slice/crio-34c30595d15558a4f68e2dcd0777715ef56171787e2d05b001251128e2fa1272 WatchSource:0}: Error finding container 34c30595d15558a4f68e2dcd0777715ef56171787e2d05b001251128e2fa1272: Status 404 returned error can't find the container with id 34c30595d15558a4f68e2dcd0777715ef56171787e2d05b001251128e2fa1272 Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.086762 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg"] Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.092821 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v"] Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.100115 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz"] Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.103261 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gddpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-jfdzb_openstack-operators(071f0af8-4164-4f95-b0ee-720e3b3097f3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.103344 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mpfzh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-trjt4_openstack-operators(7a887d91-fa86-45d2-a6be-aa7326f7d544): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:34:53 crc kubenswrapper[4795]: W0320 17:34:53.104830 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84a19583_b173_4fb9_8b83_d9c41a5faf79.slice/crio-86c005f15d069a320e8080c8d86af12e7ae2278f72d07fdfbde275f97c00079c WatchSource:0}: Error finding container 
86c005f15d069a320e8080c8d86af12e7ae2278f72d07fdfbde275f97c00079c: Status 404 returned error can't find the container with id 86c005f15d069a320e8080c8d86af12e7ae2278f72d07fdfbde275f97c00079c Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.104873 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" podUID="7a887d91-fa86-45d2-a6be-aa7326f7d544" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.104907 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" podUID="071f0af8-4164-4f95-b0ee-720e3b3097f3" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.106620 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-trjt4"] Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.109059 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-spfbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-dtfmz_openstack-operators(84a19583-b173-4fb9-8b83-d9c41a5faf79): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.110191 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" podUID="84a19583-b173-4fb9-8b83-d9c41a5faf79" Mar 20 17:34:53 crc 
kubenswrapper[4795]: I0320 17:34:53.266152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss"] Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.273713 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-828jr"] Mar 20 17:34:53 crc kubenswrapper[4795]: W0320 17:34:53.276651 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46248665_6f9f_46e0_8db7_6be8c47cf521.slice/crio-5ba90a5941ecbc6901bc5bd3c7ee3fb88cb6af6a43c0147d606d4c3681463a9a WatchSource:0}: Error finding container 5ba90a5941ecbc6901bc5bd3c7ee3fb88cb6af6a43c0147d606d4c3681463a9a: Status 404 returned error can't find the container with id 5ba90a5941ecbc6901bc5bd3c7ee3fb88cb6af6a43c0147d606d4c3681463a9a Mar 20 17:34:53 crc kubenswrapper[4795]: W0320 17:34:53.277314 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod933bcfd5_f2d1_404f_876d_1d3da597f415.slice/crio-8420b7743edfbf8bffb9b669d570a888ca4c6676d696d42109d26c020d0db1bc WatchSource:0}: Error finding container 8420b7743edfbf8bffb9b669d570a888ca4c6676d696d42109d26c020d0db1bc: Status 404 returned error can't find the container with id 8420b7743edfbf8bffb9b669d570a888ca4c6676d696d42109d26c020d0db1bc Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.278402 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q4549,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-jbwss_openstack-operators(46248665-6f9f-46e0-8db7-6be8c47cf521): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.279603 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5"] Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.279754 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" podUID="46248665-6f9f-46e0-8db7-6be8c47cf521" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.279735 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qp47s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-6z7j5_openstack-operators(933bcfd5-f2d1-404f-876d-1d3da597f415): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.281055 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" podUID="933bcfd5-f2d1-404f-876d-1d3da597f415" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.354121 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df"] Mar 20 17:34:53 crc kubenswrapper[4795]: W0320 17:34:53.367402 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode13cbad4_3a2f_4b3c_82d8_c3984c5a9f38.slice/crio-4546fc64daf773de573638d594c554e1efe0ce9a7c9ee345711100f2b0aaaed6 WatchSource:0}: Error finding container 4546fc64daf773de573638d594c554e1efe0ce9a7c9ee345711100f2b0aaaed6: Status 404 returned error can't find the container with id 
4546fc64daf773de573638d594c554e1efe0ce9a7c9ee345711100f2b0aaaed6 Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.443359 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.443553 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.444054 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:55.444034424 +0000 UTC m=+1038.902065965 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.484049 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" event={"ID":"071f0af8-4164-4f95-b0ee-720e3b3097f3","Type":"ContainerStarted","Data":"4cabaf11a79fb91abe478b428ede08dbdc503226c0356910eb3929edc7880bbc"} Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.485365 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" podUID="071f0af8-4164-4f95-b0ee-720e3b3097f3" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.486937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" event={"ID":"933bcfd5-f2d1-404f-876d-1d3da597f415","Type":"ContainerStarted","Data":"8420b7743edfbf8bffb9b669d570a888ca4c6676d696d42109d26c020d0db1bc"} Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.489406 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" podUID="933bcfd5-f2d1-404f-876d-1d3da597f415" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.489863 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" event={"ID":"750d9405-0514-4876-821e-9ab1f6871e87","Type":"ContainerStarted","Data":"85cc59475d6cbb9466309ccdd9b0cf283924bdd8183fe824ba562e2dca74278d"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.491613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" event={"ID":"0ffe016b-8919-4b8f-839c-669637b7accc","Type":"ContainerStarted","Data":"36d91a794c1f24f381f6acd78c49047ff9b43348de16ced056638998ec2f4d82"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.496497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" event={"ID":"9cba9cd3-4144-4262-82a2-f2330793aae6","Type":"ContainerStarted","Data":"f5b05db5be497c5a21296390e1a3cbff935b25a2884fcf202309945caad66f20"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.503595 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" event={"ID":"7a887d91-fa86-45d2-a6be-aa7326f7d544","Type":"ContainerStarted","Data":"cd30e6659db0586c6042a15cd77a7a06d5704aab30efe088dada386528984038"} Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.504940 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" podUID="7a887d91-fa86-45d2-a6be-aa7326f7d544" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.505227 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" 
event={"ID":"afefdb79-bad6-4deb-904b-515174cca414","Type":"ContainerStarted","Data":"2f3a456e174ab8c426e180d784e33c498cc2bde40a390a29114ce9a43c92dc8f"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.509146 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" event={"ID":"b47e6216-2e29-4d58-8b0c-5970aee6307b","Type":"ContainerStarted","Data":"737e1ba361aadea472d34b641c5f02bf98bbf03ae64cee976c494a4d8a141b4f"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.510270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" event={"ID":"0da03e08-561c-4b5f-89c7-af80c8f39f54","Type":"ContainerStarted","Data":"673a499c42bd510dfbf994dd82475e1399cdc7a7713ed5e4d6aaf6b083417661"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.513278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" event={"ID":"d4ff6977-1303-4267-983e-3e99935f2aae","Type":"ContainerStarted","Data":"34c30595d15558a4f68e2dcd0777715ef56171787e2d05b001251128e2fa1272"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.514919 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" event={"ID":"a957ef3d-357c-4aa4-865c-533f889257d7","Type":"ContainerStarted","Data":"5cb13b1a6325a4b0a0726c57569fa8a23c806ea420c213c17966bbbbc30c65ae"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.517676 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" event={"ID":"ded84ba8-d70a-4379-bc80-d142e5306cc7","Type":"ContainerStarted","Data":"5d134f13c7cca2bcc01cdf1ec614bc32120abb5f48e12b0f8c7864d98d632ba3"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.519581 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" event={"ID":"43804d6b-2358-46fd-bf04-26b2308f8ab0","Type":"ContainerStarted","Data":"59bacb9f3e843e29b92fd9a0dc2487c591a4c65c27efe147724842748fa8562d"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.530971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" event={"ID":"21481bba-04ec-47ce-95d0-fe27787a3d62","Type":"ContainerStarted","Data":"22226aa722815ae1f8e9787887d926498e3b5e994cde1ce60fe7afd0d17b2fbc"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.533351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" event={"ID":"4cdd16c5-b7d3-4c52-a286-f3555daf43d9","Type":"ContainerStarted","Data":"2fcb6e288ea33996299490589e0e5ef05a767952fd2ed908f11f572717d8004b"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.535169 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" event={"ID":"84901a7b-ddbf-47d9-954f-c167cd9cd46c","Type":"ContainerStarted","Data":"dd3a72b63d60bc17f484898dc47dca0889e536ec15f9cf9c214941e06449316c"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.536345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" event={"ID":"46248665-6f9f-46e0-8db7-6be8c47cf521","Type":"ContainerStarted","Data":"5ba90a5941ecbc6901bc5bd3c7ee3fb88cb6af6a43c0147d606d4c3681463a9a"} Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.537760 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" podUID="46248665-6f9f-46e0-8db7-6be8c47cf521" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.537943 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" event={"ID":"84a19583-b173-4fb9-8b83-d9c41a5faf79","Type":"ContainerStarted","Data":"86c005f15d069a320e8080c8d86af12e7ae2278f72d07fdfbde275f97c00079c"} Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.538996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" event={"ID":"e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38","Type":"ContainerStarted","Data":"4546fc64daf773de573638d594c554e1efe0ce9a7c9ee345711100f2b0aaaed6"} Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.539207 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" podUID="84a19583-b173-4fb9-8b83-d9c41a5faf79" Mar 20 17:34:53 crc kubenswrapper[4795]: I0320 17:34:53.646730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:53 crc kubenswrapper[4795]: E0320 17:34:53.646891 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:53 crc 
kubenswrapper[4795]: E0320 17:34:53.646975 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:55.646954252 +0000 UTC m=+1039.104985853 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:54 crc kubenswrapper[4795]: I0320 17:34:54.050567 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:54 crc kubenswrapper[4795]: I0320 17:34:54.050676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.050745 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.050813 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:34:56.050796564 +0000 UTC m=+1039.508828105 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.050824 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.050896 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:34:56.050878157 +0000 UTC m=+1039.508909698 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.552060 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" podUID="46248665-6f9f-46e0-8db7-6be8c47cf521" Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.552456 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" podUID="84a19583-b173-4fb9-8b83-d9c41a5faf79" Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.552929 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" podUID="7a887d91-fa86-45d2-a6be-aa7326f7d544" Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.553755 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" podUID="071f0af8-4164-4f95-b0ee-720e3b3097f3" Mar 20 17:34:54 crc kubenswrapper[4795]: E0320 17:34:54.560207 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" podUID="933bcfd5-f2d1-404f-876d-1d3da597f415" Mar 20 17:34:55 crc kubenswrapper[4795]: I0320 17:34:55.470855 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: 
\"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:55 crc kubenswrapper[4795]: E0320 17:34:55.471118 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:55 crc kubenswrapper[4795]: E0320 17:34:55.471457 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed. No retries permitted until 2026-03-20 17:34:59.471426497 +0000 UTC m=+1042.929458088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:55 crc kubenswrapper[4795]: I0320 17:34:55.674603 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:55 crc kubenswrapper[4795]: E0320 17:34:55.674834 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:55 crc kubenswrapper[4795]: E0320 17:34:55.674945 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:34:59.674916192 +0000 UTC m=+1043.132947763 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:56 crc kubenswrapper[4795]: I0320 17:34:56.255212 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:56 crc kubenswrapper[4795]: I0320 17:34:56.255444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:34:56 crc kubenswrapper[4795]: E0320 17:34:56.255590 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:34:56 crc kubenswrapper[4795]: E0320 17:34:56.256084 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:00.256057875 +0000 UTC m=+1043.714089416 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:34:56 crc kubenswrapper[4795]: E0320 17:34:56.255667 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:34:56 crc kubenswrapper[4795]: E0320 17:34:56.256550 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:00.25653864 +0000 UTC m=+1043.714570181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:34:59 crc kubenswrapper[4795]: I0320 17:34:59.517826 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:34:59 crc kubenswrapper[4795]: E0320 17:34:59.518048 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:59 crc kubenswrapper[4795]: E0320 17:34:59.518231 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert 
podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:07.51821324 +0000 UTC m=+1050.976244791 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:34:59 crc kubenswrapper[4795]: I0320 17:34:59.720238 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:34:59 crc kubenswrapper[4795]: E0320 17:34:59.720470 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:34:59 crc kubenswrapper[4795]: E0320 17:34:59.720582 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:07.720559899 +0000 UTC m=+1051.178591460 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:35:00 crc kubenswrapper[4795]: I0320 17:35:00.329070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:00 crc kubenswrapper[4795]: E0320 17:35:00.329141 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:35:00 crc kubenswrapper[4795]: E0320 17:35:00.329447 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:08.329395846 +0000 UTC m=+1051.787427427 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:35:00 crc kubenswrapper[4795]: I0320 17:35:00.329780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:00 crc kubenswrapper[4795]: E0320 17:35:00.329871 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:35:00 crc kubenswrapper[4795]: E0320 17:35:00.329914 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:08.329901892 +0000 UTC m=+1051.787933433 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:35:05 crc kubenswrapper[4795]: E0320 17:35:05.357719 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 17:35:05 crc kubenswrapper[4795]: E0320 17:35:05.358572 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fmklc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-6hsxn_openstack-operators(84901a7b-ddbf-47d9-954f-c167cd9cd46c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:35:05 crc kubenswrapper[4795]: E0320 17:35:05.359732 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" podUID="84901a7b-ddbf-47d9-954f-c167cd9cd46c" Mar 20 17:35:05 crc kubenswrapper[4795]: E0320 17:35:05.658555 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" podUID="84901a7b-ddbf-47d9-954f-c167cd9cd46c" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.646360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" event={"ID":"ded84ba8-d70a-4379-bc80-d142e5306cc7","Type":"ContainerStarted","Data":"f0111d95524df63d092078718c2b8a07e14489ced1dc928f4a37eb94743523f9"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.646786 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.648331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" event={"ID":"750d9405-0514-4876-821e-9ab1f6871e87","Type":"ContainerStarted","Data":"2a4ceef1025241bf7813f32b31cfbacb69561fa89646d59611274bee301e0401"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.648458 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.650883 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" event={"ID":"a957ef3d-357c-4aa4-865c-533f889257d7","Type":"ContainerStarted","Data":"30abe1dd0f5e01ba987f9078ad4b3e27b3791baf5cde76eb547bde20295fbe43"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.651031 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.653363 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" event={"ID":"e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38","Type":"ContainerStarted","Data":"fc7412ccaabdbde63c329639200c1eb2ed5b06f046a381f482af3feaac0a1e39"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.653438 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.654444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" event={"ID":"21481bba-04ec-47ce-95d0-fe27787a3d62","Type":"ContainerStarted","Data":"3ee9b8b9b039c452711c96b2f53386f9707dd961532dde1cc7c8927377e9c1d1"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.654505 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.658579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" event={"ID":"4cdd16c5-b7d3-4c52-a286-f3555daf43d9","Type":"ContainerStarted","Data":"77bf96924589b3fed6d330e5205405e198cf9640b6c8198a58ad91fa24fe6656"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.659098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.661487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" event={"ID":"b47e6216-2e29-4d58-8b0c-5970aee6307b","Type":"ContainerStarted","Data":"04685a046ace1ea9da0b1d9c67dfbec679438d93bb48314c6623e6fcfd082f5e"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 
17:35:06.661874 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.663248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" event={"ID":"0da03e08-561c-4b5f-89c7-af80c8f39f54","Type":"ContainerStarted","Data":"4ffd2799e63824e453db8e53879d5a9b721bc69e1b92017ad01c21c56ba64b6a"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.663769 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.666073 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" event={"ID":"9cba9cd3-4144-4262-82a2-f2330793aae6","Type":"ContainerStarted","Data":"87b55f790e0a50388374581f9ef8fa9590a309cf494bfbe7676033e0dc4d888e"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.666453 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.670042 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" event={"ID":"43804d6b-2358-46fd-bf04-26b2308f8ab0","Type":"ContainerStarted","Data":"f2389ea7871661d9bbdb0841ab68938a9508a3c849280c59cf93da116aa16e0e"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.670172 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.684892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" event={"ID":"afefdb79-bad6-4deb-904b-515174cca414","Type":"ContainerStarted","Data":"16661efb6b481c698cededa1a48a6989cae35f19d2763d45838111ab0f197459"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.684998 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.687268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" event={"ID":"d4ff6977-1303-4267-983e-3e99935f2aae","Type":"ContainerStarted","Data":"e4898a17581b68a6af81ca0110fc8504408b90f6239f76b0c7055b50073890a2"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.687363 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.688379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" event={"ID":"0ffe016b-8919-4b8f-839c-669637b7accc","Type":"ContainerStarted","Data":"0b049817003f40f212055703b53b1d9a99200e2dd68a291a3c37a3e57f322917"} Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.688511 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.705213 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" podStartSLOduration=3.020427446 podStartE2EDuration="15.705195096s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.699985958 +0000 UTC m=+1036.158017499" 
lastFinishedPulling="2026-03-20 17:35:05.384753608 +0000 UTC m=+1048.842785149" observedRunningTime="2026-03-20 17:35:06.681177458 +0000 UTC m=+1050.139208999" watchObservedRunningTime="2026-03-20 17:35:06.705195096 +0000 UTC m=+1050.163226637" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.706393 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" podStartSLOduration=3.059385927 podStartE2EDuration="15.706388634s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.680908406 +0000 UTC m=+1036.138939937" lastFinishedPulling="2026-03-20 17:35:05.327911103 +0000 UTC m=+1048.785942644" observedRunningTime="2026-03-20 17:35:06.701303763 +0000 UTC m=+1050.159335304" watchObservedRunningTime="2026-03-20 17:35:06.706388634 +0000 UTC m=+1050.164420175" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.726817 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" podStartSLOduration=3.445167239 podStartE2EDuration="15.726803759s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.101087314 +0000 UTC m=+1036.559118855" lastFinishedPulling="2026-03-20 17:35:05.382723814 +0000 UTC m=+1048.840755375" observedRunningTime="2026-03-20 17:35:06.72306164 +0000 UTC m=+1050.181093171" watchObservedRunningTime="2026-03-20 17:35:06.726803759 +0000 UTC m=+1050.184835300" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.748046 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" podStartSLOduration=3.316700342 podStartE2EDuration="15.748029159s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.901536513 +0000 UTC m=+1036.359568054" 
lastFinishedPulling="2026-03-20 17:35:05.33286532 +0000 UTC m=+1048.790896871" observedRunningTime="2026-03-20 17:35:06.745198299 +0000 UTC m=+1050.203229840" watchObservedRunningTime="2026-03-20 17:35:06.748029159 +0000 UTC m=+1050.206060700" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.810500 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" podStartSLOduration=3.298114225 podStartE2EDuration="15.810483051s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.858836864 +0000 UTC m=+1036.316868405" lastFinishedPulling="2026-03-20 17:35:05.37120568 +0000 UTC m=+1048.829237231" observedRunningTime="2026-03-20 17:35:06.809301403 +0000 UTC m=+1050.267332944" watchObservedRunningTime="2026-03-20 17:35:06.810483051 +0000 UTC m=+1050.268514592" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.810789 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" podStartSLOduration=3.803452324 podStartE2EDuration="15.810786001s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.370694358 +0000 UTC m=+1036.828725899" lastFinishedPulling="2026-03-20 17:35:05.378028035 +0000 UTC m=+1048.836059576" observedRunningTime="2026-03-20 17:35:06.774847415 +0000 UTC m=+1050.232878956" watchObservedRunningTime="2026-03-20 17:35:06.810786001 +0000 UTC m=+1050.268817542" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.831009 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" podStartSLOduration=3.757954896 podStartE2EDuration="15.830991938s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.271614549 +0000 UTC m=+1036.729646090" 
lastFinishedPulling="2026-03-20 17:35:05.344651591 +0000 UTC m=+1048.802683132" observedRunningTime="2026-03-20 17:35:06.825216366 +0000 UTC m=+1050.283247907" watchObservedRunningTime="2026-03-20 17:35:06.830991938 +0000 UTC m=+1050.289023479" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.848382 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" podStartSLOduration=3.581339759 podStartE2EDuration="15.848366607s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.102983774 +0000 UTC m=+1036.561015315" lastFinishedPulling="2026-03-20 17:35:05.370010622 +0000 UTC m=+1048.828042163" observedRunningTime="2026-03-20 17:35:06.842002446 +0000 UTC m=+1050.300033987" watchObservedRunningTime="2026-03-20 17:35:06.848366607 +0000 UTC m=+1050.306398138" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.860751 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" podStartSLOduration=3.126200065 podStartE2EDuration="15.860733588s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.598355068 +0000 UTC m=+1036.056386609" lastFinishedPulling="2026-03-20 17:35:05.332888591 +0000 UTC m=+1048.790920132" observedRunningTime="2026-03-20 17:35:06.856153143 +0000 UTC m=+1050.314184694" watchObservedRunningTime="2026-03-20 17:35:06.860733588 +0000 UTC m=+1050.318765129" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.874887 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" podStartSLOduration=3.093441453 podStartE2EDuration="15.874875495s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.563494728 +0000 UTC m=+1036.021526269" 
lastFinishedPulling="2026-03-20 17:35:05.34492874 +0000 UTC m=+1048.802960311" observedRunningTime="2026-03-20 17:35:06.873458669 +0000 UTC m=+1050.331490210" watchObservedRunningTime="2026-03-20 17:35:06.874875495 +0000 UTC m=+1050.332907026" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.925786 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" podStartSLOduration=3.61365309 podStartE2EDuration="15.925771452s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.08163034 +0000 UTC m=+1036.539661901" lastFinishedPulling="2026-03-20 17:35:05.393748722 +0000 UTC m=+1048.851780263" observedRunningTime="2026-03-20 17:35:06.908413744 +0000 UTC m=+1050.366445275" watchObservedRunningTime="2026-03-20 17:35:06.925771452 +0000 UTC m=+1050.383802993" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.926927 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" podStartSLOduration=3.509807539 podStartE2EDuration="15.926922878s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.915745241 +0000 UTC m=+1036.373776782" lastFinishedPulling="2026-03-20 17:35:05.33286058 +0000 UTC m=+1048.790892121" observedRunningTime="2026-03-20 17:35:06.924239013 +0000 UTC m=+1050.382270554" watchObservedRunningTime="2026-03-20 17:35:06.926922878 +0000 UTC m=+1050.384954419" Mar 20 17:35:06 crc kubenswrapper[4795]: I0320 17:35:06.948198 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" podStartSLOduration=3.2984648659999998 podStartE2EDuration="15.94818447s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.721266889 +0000 UTC m=+1036.179298430" 
lastFinishedPulling="2026-03-20 17:35:05.370986453 +0000 UTC m=+1048.829018034" observedRunningTime="2026-03-20 17:35:06.945234557 +0000 UTC m=+1050.403266088" watchObservedRunningTime="2026-03-20 17:35:06.94818447 +0000 UTC m=+1050.406216011" Mar 20 17:35:07 crc kubenswrapper[4795]: I0320 17:35:07.579126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" Mar 20 17:35:07 crc kubenswrapper[4795]: E0320 17:35:07.579360 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:35:07 crc kubenswrapper[4795]: E0320 17:35:07.579470 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert podName:fc0f2e63-50dd-424e-af01-3d09c9edd5b3 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:23.579440684 +0000 UTC m=+1067.037472265 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-lrjfh" (UID: "fc0f2e63-50dd-424e-af01-3d09c9edd5b3") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:35:07 crc kubenswrapper[4795]: I0320 17:35:07.781949 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" Mar 20 17:35:07 crc kubenswrapper[4795]: E0320 17:35:07.783233 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:35:07 crc kubenswrapper[4795]: E0320 17:35:07.783288 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert podName:a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2 nodeName:}" failed. No retries permitted until 2026-03-20 17:35:23.78327148 +0000 UTC m=+1067.241303031 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f557zsq" (UID: "a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:35:08 crc kubenswrapper[4795]: I0320 17:35:08.388232 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:08 crc kubenswrapper[4795]: E0320 17:35:08.388388 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:35:08 crc kubenswrapper[4795]: E0320 17:35:08.388790 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:24.388765321 +0000 UTC m=+1067.846796902 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "metrics-server-cert" not found Mar 20 17:35:08 crc kubenswrapper[4795]: I0320 17:35:08.389441 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" Mar 20 17:35:08 crc kubenswrapper[4795]: E0320 17:35:08.389570 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:35:08 crc kubenswrapper[4795]: E0320 17:35:08.389634 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs podName:0d8b26db-957e-4c0e-bb22-42f12d5beb0b nodeName:}" failed. No retries permitted until 2026-03-20 17:35:24.389618508 +0000 UTC m=+1067.847650079 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-px2ft" (UID: "0d8b26db-957e-4c0e-bb22-42f12d5beb0b") : secret "webhook-server-cert" not found Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.299966 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.300476 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.300554 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.301663 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.301825 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" 
containerID="cri-o://f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc" gracePeriod=600 Mar 20 17:35:11 crc kubenswrapper[4795]: E0320 17:35:11.506116 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8702afd1_abd3_42d0_91e6_048802e98829.slice/crio-conmon-f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.733019 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc" exitCode=0 Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.733074 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc"} Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.733109 4795 scope.go:117] "RemoveContainer" containerID="c7c7346cecb8a28bb2f3ea59134c537794aaad7d77f08b1e0c26a58bb7193842" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.870625 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-5hzvs" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.885338 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-h9f9t" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.907383 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-jgs27" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 
17:35:11.997244 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-dwx6n" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.998556 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-rmcrf" Mar 20 17:35:11 crc kubenswrapper[4795]: I0320 17:35:11.999508 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-f74p9" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.063293 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-55vp5" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.181492 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-bqzcz" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.217568 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5v5sg" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.230867 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-n7cl7" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.353245 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-6cw7v" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.440956 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-828jr" Mar 20 17:35:12 crc kubenswrapper[4795]: I0320 17:35:12.537571 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-rv5df" Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.784149 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" event={"ID":"071f0af8-4164-4f95-b0ee-720e3b3097f3","Type":"ContainerStarted","Data":"08aa33d7d50ce89e706ba6ddaecbe433defc8310c802c09b18ddb8d59573c8d5"} Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.785109 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.789333 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd"} Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.792491 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" event={"ID":"7a887d91-fa86-45d2-a6be-aa7326f7d544","Type":"ContainerStarted","Data":"7b216096aef2bce4b6237b5c7e8583113b01018eeee356898b18536631190edc"} Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.792928 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.820598 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4" podStartSLOduration=2.845517042 podStartE2EDuration="24.820578279s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.103196001 +0000 UTC m=+1036.561227542" 
lastFinishedPulling="2026-03-20 17:35:15.078257238 +0000 UTC m=+1058.536288779" observedRunningTime="2026-03-20 17:35:15.818054199 +0000 UTC m=+1059.276085750" watchObservedRunningTime="2026-03-20 17:35:15.820578279 +0000 UTC m=+1059.278609820" Mar 20 17:35:15 crc kubenswrapper[4795]: I0320 17:35:15.821846 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb" podStartSLOduration=2.846004498 podStartE2EDuration="24.821840119s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.103132759 +0000 UTC m=+1036.561164300" lastFinishedPulling="2026-03-20 17:35:15.07896838 +0000 UTC m=+1058.536999921" observedRunningTime="2026-03-20 17:35:15.801899209 +0000 UTC m=+1059.259930760" watchObservedRunningTime="2026-03-20 17:35:15.821840119 +0000 UTC m=+1059.279871650" Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.824117 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" event={"ID":"46248665-6f9f-46e0-8db7-6be8c47cf521","Type":"ContainerStarted","Data":"57ca24c7cdcdd3ae226b34644776b3fe2e822c9dc0a38813764c20e3a03aa9c9"} Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.824647 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.826663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" event={"ID":"84a19583-b173-4fb9-8b83-d9c41a5faf79","Type":"ContainerStarted","Data":"5bd228d6ce97c7a60b4988a6ee3699becc486c8687f6bd8bafd4edcb8718562e"} Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.827001 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz"
Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.828675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" event={"ID":"933bcfd5-f2d1-404f-876d-1d3da597f415","Type":"ContainerStarted","Data":"dccf44000382af16a69bc6c4de3ec06749846b5a3dd292e472c90cab8ba57ef8"}
Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.828927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5"
Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.854858 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss" podStartSLOduration=3.378006848 podStartE2EDuration="27.854835008s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.278264769 +0000 UTC m=+1036.736296320" lastFinishedPulling="2026-03-20 17:35:17.755092939 +0000 UTC m=+1061.213124480" observedRunningTime="2026-03-20 17:35:18.84732041 +0000 UTC m=+1062.305351951" watchObservedRunningTime="2026-03-20 17:35:18.854835008 +0000 UTC m=+1062.312866579"
Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.870773 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5" podStartSLOduration=2.397777453 podStartE2EDuration="26.87075417s" podCreationTimestamp="2026-03-20 17:34:52 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.279451667 +0000 UTC m=+1036.737483218" lastFinishedPulling="2026-03-20 17:35:17.752428394 +0000 UTC m=+1061.210459935" observedRunningTime="2026-03-20 17:35:18.866139284 +0000 UTC m=+1062.324170825" watchObservedRunningTime="2026-03-20 17:35:18.87075417 +0000 UTC m=+1062.328785711"
Mar 20 17:35:18 crc kubenswrapper[4795]: I0320 17:35:18.896349 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz" podStartSLOduration=3.216404215 podStartE2EDuration="27.896330708s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.108931042 +0000 UTC m=+1036.566962583" lastFinishedPulling="2026-03-20 17:35:17.788857535 +0000 UTC m=+1061.246889076" observedRunningTime="2026-03-20 17:35:18.891818485 +0000 UTC m=+1062.349850106" watchObservedRunningTime="2026-03-20 17:35:18.896330708 +0000 UTC m=+1062.354362249"
Mar 20 17:35:20 crc kubenswrapper[4795]: I0320 17:35:20.844576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" event={"ID":"84901a7b-ddbf-47d9-954f-c167cd9cd46c","Type":"ContainerStarted","Data":"003cafa21df19b3cddfac02d428915e46f4c5ad948b67de03d0d9f02908a76cb"}
Mar 20 17:35:20 crc kubenswrapper[4795]: I0320 17:35:20.845229 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn"
Mar 20 17:35:20 crc kubenswrapper[4795]: I0320 17:35:20.867205 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn" podStartSLOduration=3.099653678 podStartE2EDuration="29.867182595s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:34:52.905763816 +0000 UTC m=+1036.363795357" lastFinishedPulling="2026-03-20 17:35:19.673292703 +0000 UTC m=+1063.131324274" observedRunningTime="2026-03-20 17:35:20.863169068 +0000 UTC m=+1064.321200639" watchObservedRunningTime="2026-03-20 17:35:20.867182595 +0000 UTC m=+1064.325214146"
Mar 20 17:35:22 crc kubenswrapper[4795]: I0320 17:35:22.076964 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-trjt4"
Mar 20 17:35:22 crc kubenswrapper[4795]: I0320 17:35:22.144762 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-jfdzb"
Mar 20 17:35:22 crc kubenswrapper[4795]: I0320 17:35:22.289901 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-dtfmz"
Mar 20 17:35:22 crc kubenswrapper[4795]: I0320 17:35:22.506588 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-jbwss"
Mar 20 17:35:22 crc kubenswrapper[4795]: I0320 17:35:22.559761 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-6z7j5"
Mar 20 17:35:23 crc kubenswrapper[4795]: I0320 17:35:23.646612 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"
Mar 20 17:35:23 crc kubenswrapper[4795]: I0320 17:35:23.655141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc0f2e63-50dd-424e-af01-3d09c9edd5b3-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-lrjfh\" (UID: \"fc0f2e63-50dd-424e-af01-3d09c9edd5b3\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"
Mar 20 17:35:23 crc kubenswrapper[4795]: I0320 17:35:23.817923 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"
Mar 20 17:35:23 crc kubenswrapper[4795]: I0320 17:35:23.848590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq"
Mar 20 17:35:23 crc kubenswrapper[4795]: I0320 17:35:23.854265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f557zsq\" (UID: \"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq"
Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.060043 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"]
Mar 20 17:35:24 crc kubenswrapper[4795]: W0320 17:35:24.065894 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc0f2e63_50dd_424e_af01_3d09c9edd5b3.slice/crio-3382651b40d09007d4926d89d27a532691817b881fa456fc91e8b5baa43edf3d WatchSource:0}: Error finding container 3382651b40d09007d4926d89d27a532691817b881fa456fc91e8b5baa43edf3d: Status 404 returned error can't find the container with id 3382651b40d09007d4926d89d27a532691817b881fa456fc91e8b5baa43edf3d
Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.103893 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq"
Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.459936 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"
Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.460050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"
Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.467105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"
Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.467105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0d8b26db-957e-4c0e-bb22-42f12d5beb0b-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-px2ft\" (UID: \"0d8b26db-957e-4c0e-bb22-42f12d5beb0b\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"
Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.557043 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq"]
Mar 20 17:35:24 crc kubenswrapper[4795]: W0320 17:35:24.570963 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0fa84d9_bfa8_4b4a_82d7_51e5ae87e0d2.slice/crio-3bfbe7707b228d5b66b36a421d9d0c5aa49762413afa1993eb2a6f49f3846194 WatchSource:0}: Error finding container 3bfbe7707b228d5b66b36a421d9d0c5aa49762413afa1993eb2a6f49f3846194: Status 404 returned error can't find the container with id 3bfbe7707b228d5b66b36a421d9d0c5aa49762413afa1993eb2a6f49f3846194
Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.739927 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"
Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.875237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" event={"ID":"fc0f2e63-50dd-424e-af01-3d09c9edd5b3","Type":"ContainerStarted","Data":"3382651b40d09007d4926d89d27a532691817b881fa456fc91e8b5baa43edf3d"}
Mar 20 17:35:24 crc kubenswrapper[4795]: I0320 17:35:24.876202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" event={"ID":"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2","Type":"ContainerStarted","Data":"3bfbe7707b228d5b66b36a421d9d0c5aa49762413afa1993eb2a6f49f3846194"}
Mar 20 17:35:25 crc kubenswrapper[4795]: I0320 17:35:25.001890 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"]
Mar 20 17:35:25 crc kubenswrapper[4795]: I0320 17:35:25.884621 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" event={"ID":"0d8b26db-957e-4c0e-bb22-42f12d5beb0b","Type":"ContainerStarted","Data":"8bbd8622b4d04376a565d7c90ace4706045eef4cd55459181da6b771c44772d3"}
Mar 20 17:35:25 crc kubenswrapper[4795]: I0320 17:35:25.884939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" event={"ID":"0d8b26db-957e-4c0e-bb22-42f12d5beb0b","Type":"ContainerStarted","Data":"c93cffcb3922186ecbb0e79bbbca9058705018e87e04677df7aaa4e3ff73a1ce"}
Mar 20 17:35:25 crc kubenswrapper[4795]: I0320 17:35:25.884953 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"
Mar 20 17:35:25 crc kubenswrapper[4795]: I0320 17:35:25.918887 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft" podStartSLOduration=33.918868231 podStartE2EDuration="33.918868231s" podCreationTimestamp="2026-03-20 17:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:25.909733554 +0000 UTC m=+1069.367765095" watchObservedRunningTime="2026-03-20 17:35:25.918868231 +0000 UTC m=+1069.376899772"
Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.901601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" event={"ID":"fc0f2e63-50dd-424e-af01-3d09c9edd5b3","Type":"ContainerStarted","Data":"cb585f5bbb6c67204cf96966227bb2b3b84d639644885462b81b157f3f3de829"}
Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.902056 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"
Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.903504 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" event={"ID":"a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2","Type":"ContainerStarted","Data":"6cb9b7e64562f4a00861acdd8a538d1b881bacc3e584125bb799f3c0d02c4d80"}
Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.903868 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq"
Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.931890 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh" podStartSLOduration=34.238478526 podStartE2EDuration="36.931866039s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:35:24.067869109 +0000 UTC m=+1067.525900660" lastFinishedPulling="2026-03-20 17:35:26.761256622 +0000 UTC m=+1070.219288173" observedRunningTime="2026-03-20 17:35:27.92557803 +0000 UTC m=+1071.383609601" watchObservedRunningTime="2026-03-20 17:35:27.931866039 +0000 UTC m=+1071.389897610"
Mar 20 17:35:27 crc kubenswrapper[4795]: I0320 17:35:27.952029 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq" podStartSLOduration=34.750588686 podStartE2EDuration="36.952009025s" podCreationTimestamp="2026-03-20 17:34:51 +0000 UTC" firstStartedPulling="2026-03-20 17:35:24.57330806 +0000 UTC m=+1068.031339641" lastFinishedPulling="2026-03-20 17:35:26.774728449 +0000 UTC m=+1070.232759980" observedRunningTime="2026-03-20 17:35:27.948444502 +0000 UTC m=+1071.406476083" watchObservedRunningTime="2026-03-20 17:35:27.952009025 +0000 UTC m=+1071.410040586"
Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.076661 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-6hsxn"
Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.887296 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"]
Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.890894 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"]
Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.892811 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.998620 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.998781 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vbh9\" (UniqueName: \"kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:32 crc kubenswrapper[4795]: I0320 17:35:32.998828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.100378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vbh9\" (UniqueName: \"kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.100441 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.100502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.100924 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.101158 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.120605 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vbh9\" (UniqueName: \"kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9\") pod \"redhat-marketplace-9sc2c\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") " pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.222790 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:33 crc kubenswrapper[4795]: W0320 17:35:33.671919 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a28038_950d_4a3f_bbac_2084cf8e0ac4.slice/crio-341ceaef6f78ac3694a26b1214c08ac55be85fcf2a82904786498c5cb6d2a595 WatchSource:0}: Error finding container 341ceaef6f78ac3694a26b1214c08ac55be85fcf2a82904786498c5cb6d2a595: Status 404 returned error can't find the container with id 341ceaef6f78ac3694a26b1214c08ac55be85fcf2a82904786498c5cb6d2a595
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.680974 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"]
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.827651 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-lrjfh"
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.956450 4795 generic.go:334] "Generic (PLEG): container finished" podID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerID="a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf" exitCode=0
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.956497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerDied","Data":"a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf"}
Mar 20 17:35:33 crc kubenswrapper[4795]: I0320 17:35:33.956527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerStarted","Data":"341ceaef6f78ac3694a26b1214c08ac55be85fcf2a82904786498c5cb6d2a595"}
Mar 20 17:35:34 crc kubenswrapper[4795]: I0320 17:35:34.113199 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f557zsq"
Mar 20 17:35:34 crc kubenswrapper[4795]: I0320 17:35:34.750756 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-px2ft"
Mar 20 17:35:34 crc kubenswrapper[4795]: I0320 17:35:34.967331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerStarted","Data":"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670"}
Mar 20 17:35:35 crc kubenswrapper[4795]: I0320 17:35:35.983518 4795 generic.go:334] "Generic (PLEG): container finished" podID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerID="4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670" exitCode=0
Mar 20 17:35:35 crc kubenswrapper[4795]: I0320 17:35:35.983741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerDied","Data":"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670"}
Mar 20 17:35:36 crc kubenswrapper[4795]: I0320 17:35:36.999036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerStarted","Data":"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400"}
Mar 20 17:35:37 crc kubenswrapper[4795]: I0320 17:35:37.038509 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9sc2c" podStartSLOduration=2.559722849 podStartE2EDuration="5.038475774s" podCreationTimestamp="2026-03-20 17:35:32 +0000 UTC" firstStartedPulling="2026-03-20 17:35:33.957960046 +0000 UTC m=+1077.415991607" lastFinishedPulling="2026-03-20 17:35:36.436712961 +0000 UTC m=+1079.894744532" observedRunningTime="2026-03-20 17:35:37.026213338 +0000 UTC m=+1080.484244919" watchObservedRunningTime="2026-03-20 17:35:37.038475774 +0000 UTC m=+1080.496507365"
Mar 20 17:35:43 crc kubenswrapper[4795]: I0320 17:35:43.223065 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:43 crc kubenswrapper[4795]: I0320 17:35:43.223771 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:43 crc kubenswrapper[4795]: I0320 17:35:43.300000 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:44 crc kubenswrapper[4795]: I0320 17:35:44.137196 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:44 crc kubenswrapper[4795]: I0320 17:35:44.190942 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"]
Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.083552 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9sc2c" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="registry-server" containerID="cri-o://c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400" gracePeriod=2
Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.585904 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.607635 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities\") pod \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") "
Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.607725 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content\") pod \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") "
Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.607788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vbh9\" (UniqueName: \"kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9\") pod \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\" (UID: \"28a28038-950d-4a3f-bbac-2084cf8e0ac4\") "
Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.608879 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities" (OuterVolumeSpecName: "utilities") pod "28a28038-950d-4a3f-bbac-2084cf8e0ac4" (UID: "28a28038-950d-4a3f-bbac-2084cf8e0ac4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.609784 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.616006 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9" (OuterVolumeSpecName: "kube-api-access-7vbh9") pod "28a28038-950d-4a3f-bbac-2084cf8e0ac4" (UID: "28a28038-950d-4a3f-bbac-2084cf8e0ac4"). InnerVolumeSpecName "kube-api-access-7vbh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.654069 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28a28038-950d-4a3f-bbac-2084cf8e0ac4" (UID: "28a28038-950d-4a3f-bbac-2084cf8e0ac4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.711015 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a28038-950d-4a3f-bbac-2084cf8e0ac4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:46 crc kubenswrapper[4795]: I0320 17:35:46.711053 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vbh9\" (UniqueName: \"kubernetes.io/projected/28a28038-950d-4a3f-bbac-2084cf8e0ac4-kube-api-access-7vbh9\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.097341 4795 generic.go:334] "Generic (PLEG): container finished" podID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerID="c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400" exitCode=0
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.097449 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sc2c"
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.097435 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerDied","Data":"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400"}
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.098041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sc2c" event={"ID":"28a28038-950d-4a3f-bbac-2084cf8e0ac4","Type":"ContainerDied","Data":"341ceaef6f78ac3694a26b1214c08ac55be85fcf2a82904786498c5cb6d2a595"}
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.098086 4795 scope.go:117] "RemoveContainer" containerID="c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400"
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.135812 4795 scope.go:117] "RemoveContainer" containerID="4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670"
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.155042 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"]
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.161938 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sc2c"]
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.173295 4795 scope.go:117] "RemoveContainer" containerID="a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf"
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.219851 4795 scope.go:117] "RemoveContainer" containerID="c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400"
Mar 20 17:35:47 crc kubenswrapper[4795]: E0320 17:35:47.220370 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400\": container with ID starting with c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400 not found: ID does not exist" containerID="c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400"
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.220437 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400"} err="failed to get container status \"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400\": rpc error: code = NotFound desc = could not find container \"c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400\": container with ID starting with c544e959f87d22df7344c1df3ade570f1af5a70091772673d120774fcc686400 not found: ID does not exist"
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.220476 4795 scope.go:117] "RemoveContainer" containerID="4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670"
Mar 20 17:35:47 crc kubenswrapper[4795]: E0320 17:35:47.221183 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670\": container with ID starting with 4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670 not found: ID does not exist" containerID="4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670"
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.221238 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670"} err="failed to get container status \"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670\": rpc error: code = NotFound desc = could not find container \"4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670\": container with ID starting with 4a8f3dab5f6a81aa1c9a73f548ace405406d3f27f4650ecf72686086458b6670 not found: ID does not exist"
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.221274 4795 scope.go:117] "RemoveContainer" containerID="a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf"
Mar 20 17:35:47 crc kubenswrapper[4795]: E0320 17:35:47.221681 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf\": container with ID starting with a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf not found: ID does not exist" containerID="a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf"
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.221738 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf"} err="failed to get container status \"a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf\": rpc error: code = NotFound desc = could not find container \"a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf\": container with ID starting with a1a001379c9fe8847f938940b76dcfe89fe522bcf0272b491b762a7fdf4526bf not found: ID does not exist"
Mar 20 17:35:47 crc kubenswrapper[4795]: I0320 17:35:47.263828 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" path="/var/lib/kubelet/pods/28a28038-950d-4a3f-bbac-2084cf8e0ac4/volumes"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.585733 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"]
Mar 20 17:35:53 crc kubenswrapper[4795]: E0320 17:35:53.586482 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="extract-utilities"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.586495 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="extract-utilities"
Mar 20 17:35:53 crc kubenswrapper[4795]: E0320 17:35:53.586504 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="extract-content"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.586510 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="extract-content"
Mar 20 17:35:53 crc kubenswrapper[4795]: E0320 17:35:53.586521 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="registry-server"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.586560 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="registry-server"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.586676 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a28038-950d-4a3f-bbac-2084cf8e0ac4" containerName="registry-server"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.587347 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.594596 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.594896 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kk7jg"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.595341 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.595456 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.605340 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"]
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.615328 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"]
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.616397 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.617940 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.642352 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"]
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.728876 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.728941 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmpx4\" (UniqueName: \"kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.729499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbfdp\" (UniqueName: \"kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg"
Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.729568 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") "
pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.729665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.830571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbfdp\" (UniqueName: \"kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.830624 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.830753 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.830797 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: 
I0320 17:35:53.830846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmpx4\" (UniqueName: \"kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.831823 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.832094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.832451 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.857567 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmpx4\" (UniqueName: \"kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4\") pod \"dnsmasq-dns-78dd6ddcc-t4qxc\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.866455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mbfdp\" (UniqueName: \"kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp\") pod \"dnsmasq-dns-675f4bcbfc-2zkfg\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.904482 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:35:53 crc kubenswrapper[4795]: I0320 17:35:53.930547 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:35:54 crc kubenswrapper[4795]: I0320 17:35:54.371724 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"] Mar 20 17:35:54 crc kubenswrapper[4795]: W0320 17:35:54.426993 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11fbdcb2_cc31_4fe8_be5f_80df050a7a93.slice/crio-c17bc707b7e046c509096b18cdd373a490c6533a741e73bc3005dac5daab314f WatchSource:0}: Error finding container c17bc707b7e046c509096b18cdd373a490c6533a741e73bc3005dac5daab314f: Status 404 returned error can't find the container with id c17bc707b7e046c509096b18cdd373a490c6533a741e73bc3005dac5daab314f Mar 20 17:35:54 crc kubenswrapper[4795]: I0320 17:35:54.427472 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"] Mar 20 17:35:55 crc kubenswrapper[4795]: I0320 17:35:55.160718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" event={"ID":"11fbdcb2-cc31-4fe8-be5f-80df050a7a93","Type":"ContainerStarted","Data":"c17bc707b7e046c509096b18cdd373a490c6533a741e73bc3005dac5daab314f"} Mar 20 17:35:55 crc kubenswrapper[4795]: I0320 17:35:55.162519 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" 
event={"ID":"7f272673-089e-4e0d-ad79-ee04004f6c62","Type":"ContainerStarted","Data":"c52b78bf8dbc75f740fc999798595970e27f414c3577a45dac0538a7d22d794f"} Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.173720 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.195738 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.197561 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.201535 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.272516 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.272625 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.272657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgldg\" (UniqueName: \"kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " 
pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.374377 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.374438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgldg\" (UniqueName: \"kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.374472 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.375431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.375917 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.400292 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgldg\" (UniqueName: \"kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg\") pod \"dnsmasq-dns-666b6646f7-w4cr8\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.513551 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.521320 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.560772 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.562629 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.567149 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.584659 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwzs\" (UniqueName: \"kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.584957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" 
Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.585068 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.819296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwzs\" (UniqueName: \"kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.819574 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.819599 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.820364 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.820891 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.857840 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwzs\" (UniqueName: \"kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs\") pod \"dnsmasq-dns-57d769cc4f-gg8p4\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:56 crc kubenswrapper[4795]: I0320 17:35:56.944018 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.158243 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.168967 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.170233 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.172173 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.172385 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-84wwv" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.172572 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.173231 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.173416 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.173442 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.173501 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.184968 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224774 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224862 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4h5b\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.224989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.225003 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.225027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.225043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.225065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" 
(UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326553 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326600 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326677 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326720 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326762 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.326873 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4h5b\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.327131 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.327364 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.327427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.327861 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.329608 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.330137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 
17:35:57.330862 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.331652 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.332563 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.333807 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.342491 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4h5b\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b\") pod \"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.349533 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"rabbitmq-server-0\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.396310 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.457723 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.459467 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.465362 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.465443 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.466202 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.466358 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.466514 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.466709 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.466860 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pf5bc" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.483857 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:35:57 
crc kubenswrapper[4795]: I0320 17:35:57.498569 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530108 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530151 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530187 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6r2k\" (UniqueName: 
\"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.530453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631401 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631493 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631545 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6r2k\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631591 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.631623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.633351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.633744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.634060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc 
kubenswrapper[4795]: I0320 17:35:57.634175 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.636671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.643764 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.653620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.660365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.661255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j6r2k\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.661275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.662834 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.663682 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:57 crc kubenswrapper[4795]: I0320 17:35:57.831654 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.196131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" event={"ID":"188f326f-74f0-423d-9ae1-54aae0c1474e","Type":"ContainerStarted","Data":"eceba19972887264330468d5439623bcd97b3e95bfc235711a65beb5805973f3"} Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.877746 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.879117 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.892136 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.897977 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.903050 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.903781 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5tdxw" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.904106 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 17:35:58 crc kubenswrapper[4795]: I0320 17:35:58.913719 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: 
\"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051467 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051514 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051652 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051699 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwzw\" (UniqueName: \"kubernetes.io/projected/0f5a24ef-fc80-4386-9f81-5f21154223f3-kube-api-access-npwzw\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051759 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051785 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.051815 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153202 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153280 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 
17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153399 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153552 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-npwzw\" (UniqueName: \"kubernetes.io/projected/0f5a24ef-fc80-4386-9f81-5f21154223f3-kube-api-access-npwzw\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.153826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.154442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.154584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.165414 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5a24ef-fc80-4386-9f81-5f21154223f3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.169545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.176356 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwzw\" (UniqueName: \"kubernetes.io/projected/0f5a24ef-fc80-4386-9f81-5f21154223f3-kube-api-access-npwzw\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.177123 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.179444 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f5a24ef-fc80-4386-9f81-5f21154223f3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f5a24ef-fc80-4386-9f81-5f21154223f3\") " pod="openstack/openstack-galera-0" Mar 20 17:35:59 crc kubenswrapper[4795]: I0320 17:35:59.242456 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.136777 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567136-j4mtv"] Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.138207 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-j4mtv" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.142231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.142408 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.143108 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.152437 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-j4mtv"] Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.270734 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28trk\" (UniqueName: \"kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk\") pod \"auto-csr-approver-29567136-j4mtv\" (UID: \"38f88deb-b38d-4c52-a901-baeb9da08559\") " pod="openshift-infra/auto-csr-approver-29567136-j4mtv" Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.311877 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.313904 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.317050 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-vttrp"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.317320 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.320210 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.320502 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.328158 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.372456 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28trk\" (UniqueName: \"kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk\") pod \"auto-csr-approver-29567136-j4mtv\" (UID: \"38f88deb-b38d-4c52-a901-baeb9da08559\") " pod="openshift-infra/auto-csr-approver-29567136-j4mtv"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.405431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28trk\" (UniqueName: \"kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk\") pod \"auto-csr-approver-29567136-j4mtv\" (UID: \"38f88deb-b38d-4c52-a901-baeb9da08559\") " pod="openshift-infra/auto-csr-approver-29567136-j4mtv"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.473912 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.474178 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.474369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.474604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.474846 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhwkw\" (UniqueName: \"kubernetes.io/projected/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kube-api-access-zhwkw\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.475072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.475240 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.475437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.491129 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-j4mtv"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576836 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576911 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhwkw\" (UniqueName: \"kubernetes.io/projected/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kube-api-access-zhwkw\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.576987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.577023 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.577412 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.578672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.577819 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.579329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.580006 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.580320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.583267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.614190 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhwkw\" (UniqueName: \"kubernetes.io/projected/6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987-kube-api-access-zhwkw\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.617908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.631778 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.776907 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.778086 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.782576 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.782879 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.783087 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4ttzr"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.790259 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.881477 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-kolla-config\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.881545 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t84rh\" (UniqueName: \"kubernetes.io/projected/ca95ec62-fce9-4c91-bb59-fa80f512edba-kube-api-access-t84rh\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.881573 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.881611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-config-data\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.881657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.983975 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t84rh\" (UniqueName: \"kubernetes.io/projected/ca95ec62-fce9-4c91-bb59-fa80f512edba-kube-api-access-t84rh\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.984090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.984242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-config-data\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.984376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.984455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-kolla-config\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.985459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-kolla-config\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.985910 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca95ec62-fce9-4c91-bb59-fa80f512edba-config-data\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.989476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:00 crc kubenswrapper[4795]: I0320 17:36:00.995049 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca95ec62-fce9-4c91-bb59-fa80f512edba-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:01 crc kubenswrapper[4795]: I0320 17:36:01.001003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t84rh\" (UniqueName: \"kubernetes.io/projected/ca95ec62-fce9-4c91-bb59-fa80f512edba-kube-api-access-t84rh\") pod \"memcached-0\" (UID: \"ca95ec62-fce9-4c91-bb59-fa80f512edba\") " pod="openstack/memcached-0"
Mar 20 17:36:01 crc kubenswrapper[4795]: I0320 17:36:01.100827 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 17:36:01 crc kubenswrapper[4795]: I0320 17:36:01.263276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" event={"ID":"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2","Type":"ContainerStarted","Data":"a6075430a5a7a42dcee3b92020556eee6821261a85421f6ff8cc34985b56804c"}
Mar 20 17:36:02 crc kubenswrapper[4795]: I0320 17:36:02.779959 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 17:36:02 crc kubenswrapper[4795]: I0320 17:36:02.782230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 17:36:02 crc kubenswrapper[4795]: I0320 17:36:02.784367 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-dwl8l"
Mar 20 17:36:02 crc kubenswrapper[4795]: I0320 17:36:02.798984 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 17:36:02 crc kubenswrapper[4795]: I0320 17:36:02.932146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k6xp\" (UniqueName: \"kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp\") pod \"kube-state-metrics-0\" (UID: \"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8\") " pod="openstack/kube-state-metrics-0"
Mar 20 17:36:03 crc kubenswrapper[4795]: I0320 17:36:03.035923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k6xp\" (UniqueName: \"kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp\") pod \"kube-state-metrics-0\" (UID: \"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8\") " pod="openstack/kube-state-metrics-0"
Mar 20 17:36:03 crc kubenswrapper[4795]: I0320 17:36:03.083905 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k6xp\" (UniqueName: \"kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp\") pod \"kube-state-metrics-0\" (UID: \"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8\") " pod="openstack/kube-state-metrics-0"
Mar 20 17:36:03 crc kubenswrapper[4795]: I0320 17:36:03.168657 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.061628 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dnp2g"]
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.062866 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.065241 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-tm86q"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.065462 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.065643 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.071161 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dsqcc"]
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.072668 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.076956 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g"]
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.098164 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dsqcc"]
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190354 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-log\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190407 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-ovn-controller-tls-certs\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5675bf5e-3a57-4082-8631-680ced6fb634-scripts\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-lib\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8j4m\" (UniqueName: \"kubernetes.io/projected/28df10bb-d6a9-47a9-9b79-0bb9665529ef-kube-api-access-h8j4m\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-run\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190553 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-log-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190567 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-etc-ovs\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfrvv\" (UniqueName: \"kubernetes.io/projected/5675bf5e-3a57-4082-8631-680ced6fb634-kube-api-access-dfrvv\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190630 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190647 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28df10bb-d6a9-47a9-9b79-0bb9665529ef-scripts\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.190665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-combined-ca-bundle\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.291961 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-combined-ca-bundle\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.291999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28df10bb-d6a9-47a9-9b79-0bb9665529ef-scripts\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292054 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-log\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292084 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-ovn-controller-tls-certs\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292098 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5675bf5e-3a57-4082-8631-680ced6fb634-scripts\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-lib\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8j4m\" (UniqueName: \"kubernetes.io/projected/28df10bb-d6a9-47a9-9b79-0bb9665529ef-kube-api-access-h8j4m\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-run\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292169 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-log-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292184 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-etc-ovs\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfrvv\" (UniqueName: \"kubernetes.io/projected/5675bf5e-3a57-4082-8631-680ced6fb634-kube-api-access-dfrvv\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.292256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-log-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-etc-ovs\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-log\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294221 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-lib\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294819 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.294828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5675bf5e-3a57-4082-8631-680ced6fb634-var-run\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.296954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5675bf5e-3a57-4082-8631-680ced6fb634-scripts\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.297197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28df10bb-d6a9-47a9-9b79-0bb9665529ef-scripts\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.297321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/28df10bb-d6a9-47a9-9b79-0bb9665529ef-var-run-ovn\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.297892 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-ovn-controller-tls-certs\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.308299 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8j4m\" (UniqueName: \"kubernetes.io/projected/28df10bb-d6a9-47a9-9b79-0bb9665529ef-kube-api-access-h8j4m\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.309783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28df10bb-d6a9-47a9-9b79-0bb9665529ef-combined-ca-bundle\") pod \"ovn-controller-dnp2g\" (UID: \"28df10bb-d6a9-47a9-9b79-0bb9665529ef\") " pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.314094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfrvv\" (UniqueName: \"kubernetes.io/projected/5675bf5e-3a57-4082-8631-680ced6fb634-kube-api-access-dfrvv\") pod \"ovn-controller-ovs-dsqcc\" (UID: \"5675bf5e-3a57-4082-8631-680ced6fb634\") " pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.402451 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g"
Mar 20 17:36:06 crc kubenswrapper[4795]: I0320 17:36:06.404084 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.839044 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.840631 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.843158 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.843219 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.844225 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.844333 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.844646 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6cdhb"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.864835 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.946259 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.946638 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srznh\" (UniqueName: \"kubernetes.io/projected/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-kube-api-access-srznh\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.946822 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.946969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.947116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-config\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.947269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.947387 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 17:36:09 crc kubenswrapper[4795]: I0320 17:36:09.947520 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName:
\"kubernetes.io/empty-dir/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.011406 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.012088 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbfdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabil
ities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-2zkfg_openstack(7f272673-089e-4e0d-ad79-ee04004f6c62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.013878 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" podUID="7f272673-089e-4e0d-ad79-ee04004f6c62" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.025636 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.025778 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fmpx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-t4qxc_openstack(11fbdcb2-cc31-4fe8-be5f-80df050a7a93): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:36:10 crc kubenswrapper[4795]: E0320 17:36:10.027043 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" podUID="11fbdcb2-cc31-4fe8-be5f-80df050a7a93" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049324 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-config\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049401 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049425 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srznh\" (UniqueName: \"kubernetes.io/projected/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-kube-api-access-srznh\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049489 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.049520 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.050587 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.051197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.051484 4795 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.054646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-config\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.056510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.058512 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.068967 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.070805 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.072605 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ldxcv" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.075092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.076275 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.076516 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.076658 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.077206 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srznh\" (UniqueName: \"kubernetes.io/projected/3b9d4ac2-2b66-441a-a6d4-0d467d857f99-kube-api-access-srznh\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.091054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3b9d4ac2-2b66-441a-a6d4-0d467d857f99\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.098814 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 
17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150545 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d96tc\" (UniqueName: \"kubernetes.io/projected/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-kube-api-access-d96tc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150623 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150650 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150770 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-config\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150811 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.150834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.198391 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.252794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.252866 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.252900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.252966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.253000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-config\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.253040 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.253060 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.253130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d96tc\" (UniqueName: \"kubernetes.io/projected/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-kube-api-access-d96tc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.253365 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.254208 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.258336 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.258430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.261216 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.263701 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.270080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.273930 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d96tc\" (UniqueName: \"kubernetes.io/projected/c07f346e-3e6c-41a5-bdda-67a4a5f04ba7-kube-api-access-d96tc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.303789 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.369460 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.785124 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.795921 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:36:10 crc kubenswrapper[4795]: W0320 17:36:10.798268 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f5a24ef_fc80_4386_9f81_5f21154223f3.slice/crio-7073a77e150e717db0aee45fa7ce5bfee04124e240591e313a106e97438ee45c WatchSource:0}: Error finding container 7073a77e150e717db0aee45fa7ce5bfee04124e240591e313a106e97438ee45c: Status 404 returned error can't find the container with id 7073a77e150e717db0aee45fa7ce5bfee04124e240591e313a106e97438ee45c Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.811766 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.842180 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.846559 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.977909 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.987708 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config\") pod \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.987774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config\") pod \"7f272673-089e-4e0d-ad79-ee04004f6c62\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.987864 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmpx4\" (UniqueName: \"kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4\") pod \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.987905 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc\") pod \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\" (UID: \"11fbdcb2-cc31-4fe8-be5f-80df050a7a93\") " Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.987939 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbfdp\" (UniqueName: \"kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp\") pod \"7f272673-089e-4e0d-ad79-ee04004f6c62\" (UID: \"7f272673-089e-4e0d-ad79-ee04004f6c62\") " Mar 20 17:36:10 crc 
kubenswrapper[4795]: I0320 17:36:10.989316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config" (OuterVolumeSpecName: "config") pod "7f272673-089e-4e0d-ad79-ee04004f6c62" (UID: "7f272673-089e-4e0d-ad79-ee04004f6c62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.989619 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11fbdcb2-cc31-4fe8-be5f-80df050a7a93" (UID: "11fbdcb2-cc31-4fe8-be5f-80df050a7a93"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.993023 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g"] Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.993094 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4" (OuterVolumeSpecName: "kube-api-access-fmpx4") pod "11fbdcb2-cc31-4fe8-be5f-80df050a7a93" (UID: "11fbdcb2-cc31-4fe8-be5f-80df050a7a93"). InnerVolumeSpecName "kube-api-access-fmpx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:10 crc kubenswrapper[4795]: I0320 17:36:10.993946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp" (OuterVolumeSpecName: "kube-api-access-mbfdp") pod "7f272673-089e-4e0d-ad79-ee04004f6c62" (UID: "7f272673-089e-4e0d-ad79-ee04004f6c62"). InnerVolumeSpecName "kube-api-access-mbfdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.001658 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config" (OuterVolumeSpecName: "config") pod "11fbdcb2-cc31-4fe8-be5f-80df050a7a93" (UID: "11fbdcb2-cc31-4fe8-be5f-80df050a7a93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:11 crc kubenswrapper[4795]: W0320 17:36:11.014633 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca95ec62_fce9_4c91_bb59_fa80f512edba.slice/crio-a2af658a384a60f6fe161b1f5cba4d8abda1a769472141968e9444e6dd01ee96 WatchSource:0}: Error finding container a2af658a384a60f6fe161b1f5cba4d8abda1a769472141968e9444e6dd01ee96: Status 404 returned error can't find the container with id a2af658a384a60f6fe161b1f5cba4d8abda1a769472141968e9444e6dd01ee96 Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.018327 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.035498 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-j4mtv"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.040409 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.090173 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmpx4\" (UniqueName: \"kubernetes.io/projected/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-kube-api-access-fmpx4\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.090209 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-dns-svc\") 
on node \"crc\" DevicePath \"\"" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.090222 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbfdp\" (UniqueName: \"kubernetes.io/projected/7f272673-089e-4e0d-ad79-ee04004f6c62-kube-api-access-mbfdp\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.090234 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11fbdcb2-cc31-4fe8-be5f-80df050a7a93-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.090246 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f272673-089e-4e0d-ad79-ee04004f6c62-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.097033 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 17:36:11 crc kubenswrapper[4795]: W0320 17:36:11.101767 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b9d4ac2_2b66_441a_a6d4_0d467d857f99.slice/crio-59347bda18896f5441b38e7dc9139e1ccfae99e35d69a3c62c0bdf1613d121b8 WatchSource:0}: Error finding container 59347bda18896f5441b38e7dc9139e1ccfae99e35d69a3c62c0bdf1613d121b8: Status 404 returned error can't find the container with id 59347bda18896f5441b38e7dc9139e1ccfae99e35d69a3c62c0bdf1613d121b8 Mar 20 17:36:11 crc kubenswrapper[4795]: W0320 17:36:11.201081 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc07f346e_3e6c_41a5_bdda_67a4a5f04ba7.slice/crio-5ed0c34c3b4eefa74c9608d1cd68f505f81bad8d95898aa642b6cd4ec42c4ff9 WatchSource:0}: Error finding container 5ed0c34c3b4eefa74c9608d1cd68f505f81bad8d95898aa642b6cd4ec42c4ff9: Status 404 returned error can't find the container with id 
5ed0c34c3b4eefa74c9608d1cd68f505f81bad8d95898aa642b6cd4ec42c4ff9 Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.201308 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.323066 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f5a24ef-fc80-4386-9f81-5f21154223f3","Type":"ContainerStarted","Data":"7073a77e150e717db0aee45fa7ce5bfee04124e240591e313a106e97438ee45c"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.324243 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g" event={"ID":"28df10bb-d6a9-47a9-9b79-0bb9665529ef","Type":"ContainerStarted","Data":"c01b1888a8b541ad374dbfa8a6411bcdf2e68ac5cfc9aaf1bb0d5e9bce548c26"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.325396 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" event={"ID":"11fbdcb2-cc31-4fe8-be5f-80df050a7a93","Type":"ContainerDied","Data":"c17bc707b7e046c509096b18cdd373a490c6533a741e73bc3005dac5daab314f"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.325405 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-t4qxc" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.326820 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerStarted","Data":"0e5a7ece35e45546c5839b24c64b62f1f72a3acb63297d6f97fc0dca60bde01d"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.330286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-j4mtv" event={"ID":"38f88deb-b38d-4c52-a901-baeb9da08559","Type":"ContainerStarted","Data":"d53c776ab6a465f3a075842cf379886815e4d1caf766e21f6fcc37ed80564b8a"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.332018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7","Type":"ContainerStarted","Data":"5ed0c34c3b4eefa74c9608d1cd68f505f81bad8d95898aa642b6cd4ec42c4ff9"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.333150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ca95ec62-fce9-4c91-bb59-fa80f512edba","Type":"ContainerStarted","Data":"a2af658a384a60f6fe161b1f5cba4d8abda1a769472141968e9444e6dd01ee96"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.335232 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3b9d4ac2-2b66-441a-a6d4-0d467d857f99","Type":"ContainerStarted","Data":"59347bda18896f5441b38e7dc9139e1ccfae99e35d69a3c62c0bdf1613d121b8"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.336138 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerStarted","Data":"12a00ee882324adc5e7b3fa5833c8430141d6a20302db2d5f549cf873b0d421d"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.337263 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" event={"ID":"7f272673-089e-4e0d-ad79-ee04004f6c62","Type":"ContainerDied","Data":"c52b78bf8dbc75f740fc999798595970e27f414c3577a45dac0538a7d22d794f"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.337375 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2zkfg" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.338510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987","Type":"ContainerStarted","Data":"64b4d3bbfc53ca1c715ea49f2ff8de0536ae2c9c0fde4b3a6dd9f838d8a8b4c0"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.340449 4795 generic.go:334] "Generic (PLEG): container finished" podID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerID="dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589" exitCode=0 Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.340748 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" event={"ID":"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2","Type":"ContainerDied","Data":"dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.346887 4795 generic.go:334] "Generic (PLEG): container finished" podID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerID="bb90f0d50353c309b35e0e03c747d10c744c81a1b2cc09b5b5dec2672959655d" exitCode=0 Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.346973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" event={"ID":"188f326f-74f0-423d-9ae1-54aae0c1474e","Type":"ContainerDied","Data":"bb90f0d50353c309b35e0e03c747d10c744c81a1b2cc09b5b5dec2672959655d"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.356804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8","Type":"ContainerStarted","Data":"715534f72ece852c083764840657cce952ec7708ddcedcd00af2caddc251418f"} Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.419844 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.432454 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-t4qxc"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.454171 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"] Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.458312 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2zkfg"] Mar 20 17:36:11 crc kubenswrapper[4795]: E0320 17:36:11.549983 4795 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 20 17:36:11 crc kubenswrapper[4795]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/188f326f-74f0-423d-9ae1-54aae0c1474e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 17:36:11 crc kubenswrapper[4795]: > podSandboxID="eceba19972887264330468d5439623bcd97b3e95bfc235711a65beb5805973f3" Mar 20 17:36:11 crc kubenswrapper[4795]: E0320 17:36:11.550878 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:36:11 crc kubenswrapper[4795]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jgldg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-w4cr8_openstack(188f326f-74f0-423d-9ae1-54aae0c1474e): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/188f326f-74f0-423d-9ae1-54aae0c1474e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 17:36:11 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:36:11 crc kubenswrapper[4795]: E0320 17:36:11.552142 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/188f326f-74f0-423d-9ae1-54aae0c1474e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" Mar 20 17:36:11 crc kubenswrapper[4795]: I0320 17:36:11.922897 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dsqcc"] Mar 20 17:36:11 crc kubenswrapper[4795]: W0320 17:36:11.932513 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5675bf5e_3a57_4082_8631_680ced6fb634.slice/crio-bbfec97bb460eff0d3d30f4022bdb401b51118cb284a06e54e7cb6d9a5314a7f WatchSource:0}: Error finding container bbfec97bb460eff0d3d30f4022bdb401b51118cb284a06e54e7cb6d9a5314a7f: Status 404 returned error can't find the container with id bbfec97bb460eff0d3d30f4022bdb401b51118cb284a06e54e7cb6d9a5314a7f Mar 20 17:36:12 crc kubenswrapper[4795]: I0320 17:36:12.366548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" event={"ID":"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2","Type":"ContainerStarted","Data":"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe"} Mar 20 17:36:12 crc kubenswrapper[4795]: I0320 17:36:12.366765 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:36:12 crc kubenswrapper[4795]: I0320 17:36:12.367767 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsqcc" event={"ID":"5675bf5e-3a57-4082-8631-680ced6fb634","Type":"ContainerStarted","Data":"bbfec97bb460eff0d3d30f4022bdb401b51118cb284a06e54e7cb6d9a5314a7f"} Mar 20 17:36:12 crc kubenswrapper[4795]: I0320 17:36:12.388113 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" podStartSLOduration=7.317579223 podStartE2EDuration="16.388051601s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="2026-03-20 17:36:01.051720565 +0000 UTC m=+1104.509752106" lastFinishedPulling="2026-03-20 17:36:10.122192943 +0000 UTC m=+1113.580224484" observedRunningTime="2026-03-20 17:36:12.382763455 +0000 UTC m=+1115.840794996" watchObservedRunningTime="2026-03-20 17:36:12.388051601 +0000 UTC m=+1115.846083162" Mar 20 17:36:13 crc kubenswrapper[4795]: I0320 17:36:13.278983 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="11fbdcb2-cc31-4fe8-be5f-80df050a7a93" path="/var/lib/kubelet/pods/11fbdcb2-cc31-4fe8-be5f-80df050a7a93/volumes" Mar 20 17:36:13 crc kubenswrapper[4795]: I0320 17:36:13.279676 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f272673-089e-4e0d-ad79-ee04004f6c62" path="/var/lib/kubelet/pods/7f272673-089e-4e0d-ad79-ee04004f6c62/volumes" Mar 20 17:36:13 crc kubenswrapper[4795]: I0320 17:36:13.378144 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" event={"ID":"188f326f-74f0-423d-9ae1-54aae0c1474e","Type":"ContainerStarted","Data":"87534ecfb167ccd1f65f7ca6c16ec74ef295ebd69a68982ce1f8e8d67186a8f5"} Mar 20 17:36:13 crc kubenswrapper[4795]: I0320 17:36:13.378436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:36:13 crc kubenswrapper[4795]: I0320 17:36:13.398064 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" podStartSLOduration=4.487993861 podStartE2EDuration="17.398048982s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="2026-03-20 17:35:57.20244868 +0000 UTC m=+1100.660480221" lastFinishedPulling="2026-03-20 17:36:10.112503801 +0000 UTC m=+1113.570535342" observedRunningTime="2026-03-20 17:36:13.394380557 +0000 UTC m=+1116.852412098" watchObservedRunningTime="2026-03-20 17:36:13.398048982 +0000 UTC m=+1116.856080523" Mar 20 17:36:16 crc kubenswrapper[4795]: I0320 17:36:16.945927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:36:17 crc kubenswrapper[4795]: I0320 17:36:17.004929 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:36:17 crc kubenswrapper[4795]: I0320 17:36:17.005126 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="dnsmasq-dns" containerID="cri-o://87534ecfb167ccd1f65f7ca6c16ec74ef295ebd69a68982ce1f8e8d67186a8f5" gracePeriod=10 Mar 20 17:36:17 crc kubenswrapper[4795]: I0320 17:36:17.009927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:36:17 crc kubenswrapper[4795]: I0320 17:36:17.410609 4795 generic.go:334] "Generic (PLEG): container finished" podID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerID="87534ecfb167ccd1f65f7ca6c16ec74ef295ebd69a68982ce1f8e8d67186a8f5" exitCode=0 Mar 20 17:36:17 crc kubenswrapper[4795]: I0320 17:36:17.410932 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" event={"ID":"188f326f-74f0-423d-9ae1-54aae0c1474e","Type":"ContainerDied","Data":"87534ecfb167ccd1f65f7ca6c16ec74ef295ebd69a68982ce1f8e8d67186a8f5"} Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.183495 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-n4gzx"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.184364 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.186191 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.205087 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-n4gzx"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344339 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7zng\" (UniqueName: \"kubernetes.io/projected/85004117-20bc-474e-88f5-ce49032749ff-kube-api-access-c7zng\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344407 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovn-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-combined-ca-bundle\") pod \"ovn-controller-metrics-n4gzx\" (UID: 
\"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344549 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85004117-20bc-474e-88f5-ce49032749ff-config\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.344587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovs-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.359627 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-655xx"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.360819 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.363025 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.375988 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-655xx"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446067 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7zng\" (UniqueName: \"kubernetes.io/projected/85004117-20bc-474e-88f5-ce49032749ff-kube-api-access-c7zng\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovn-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc 
kubenswrapper[4795]: I0320 17:36:19.446373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm462\" (UniqueName: \"kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446461 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-combined-ca-bundle\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446540 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446612 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovn-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc 
kubenswrapper[4795]: I0320 17:36:19.446669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85004117-20bc-474e-88f5-ce49032749ff-config\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovs-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.446867 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/85004117-20bc-474e-88f5-ce49032749ff-ovs-rundir\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.447472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85004117-20bc-474e-88f5-ce49032749ff-config\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.464811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-combined-ca-bundle\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.465179 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/85004117-20bc-474e-88f5-ce49032749ff-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.470120 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-655xx"] Mar 20 17:36:19 crc kubenswrapper[4795]: E0320 17:36:19.470663 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-gm462 ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7f896c8c65-655xx" podUID="5d9eecdd-6791-4b24-8855-6036767861cf" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.475233 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7zng\" (UniqueName: \"kubernetes.io/projected/85004117-20bc-474e-88f5-ce49032749ff-kube-api-access-c7zng\") pod \"ovn-controller-metrics-n4gzx\" (UID: \"85004117-20bc-474e-88f5-ce49032749ff\") " pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.500544 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.501788 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.508429 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-n4gzx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.511103 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.532658 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"] Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.548882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm462\" (UniqueName: \"kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.548943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.548997 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.549097 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.551795 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.551872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.553235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.584762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm462\" (UniqueName: \"kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462\") pod \"dnsmasq-dns-7f896c8c65-655xx\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.649996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.650065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.650227 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm699\" (UniqueName: \"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.650353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.650471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.751935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.752003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.752061 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm699\" (UniqueName: \"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.752094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.752130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.752864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.753022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.753102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.753336 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.772450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm699\" (UniqueName: \"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699\") pod \"dnsmasq-dns-86db49b7ff-g9nkw\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:19 crc kubenswrapper[4795]: I0320 17:36:19.865620 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.432981 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.441810 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.568613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm462\" (UniqueName: \"kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462\") pod \"5d9eecdd-6791-4b24-8855-6036767861cf\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.568765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc\") pod \"5d9eecdd-6791-4b24-8855-6036767861cf\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.568808 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb\") pod \"5d9eecdd-6791-4b24-8855-6036767861cf\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.568837 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config\") pod \"5d9eecdd-6791-4b24-8855-6036767861cf\" (UID: \"5d9eecdd-6791-4b24-8855-6036767861cf\") " Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.569544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5d9eecdd-6791-4b24-8855-6036767861cf" (UID: "5d9eecdd-6791-4b24-8855-6036767861cf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.569801 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config" (OuterVolumeSpecName: "config") pod "5d9eecdd-6791-4b24-8855-6036767861cf" (UID: "5d9eecdd-6791-4b24-8855-6036767861cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.570072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5d9eecdd-6791-4b24-8855-6036767861cf" (UID: "5d9eecdd-6791-4b24-8855-6036767861cf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.575228 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462" (OuterVolumeSpecName: "kube-api-access-gm462") pod "5d9eecdd-6791-4b24-8855-6036767861cf" (UID: "5d9eecdd-6791-4b24-8855-6036767861cf"). InnerVolumeSpecName "kube-api-access-gm462". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.671877 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.671936 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.671956 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d9eecdd-6791-4b24-8855-6036767861cf-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:20 crc kubenswrapper[4795]: I0320 17:36:20.671973 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm462\" (UniqueName: \"kubernetes.io/projected/5d9eecdd-6791-4b24-8855-6036767861cf-kube-api-access-gm462\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:21 crc kubenswrapper[4795]: I0320 17:36:21.440501 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-655xx" Mar 20 17:36:21 crc kubenswrapper[4795]: I0320 17:36:21.482006 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-655xx"] Mar 20 17:36:21 crc kubenswrapper[4795]: I0320 17:36:21.488376 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-655xx"] Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.742027 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.916227 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc\") pod \"188f326f-74f0-423d-9ae1-54aae0c1474e\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.916338 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config\") pod \"188f326f-74f0-423d-9ae1-54aae0c1474e\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.916437 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgldg\" (UniqueName: \"kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg\") pod \"188f326f-74f0-423d-9ae1-54aae0c1474e\" (UID: \"188f326f-74f0-423d-9ae1-54aae0c1474e\") " Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.943080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg" (OuterVolumeSpecName: "kube-api-access-jgldg") pod "188f326f-74f0-423d-9ae1-54aae0c1474e" (UID: "188f326f-74f0-423d-9ae1-54aae0c1474e"). InnerVolumeSpecName "kube-api-access-jgldg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.966140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config" (OuterVolumeSpecName: "config") pod "188f326f-74f0-423d-9ae1-54aae0c1474e" (UID: "188f326f-74f0-423d-9ae1-54aae0c1474e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:22 crc kubenswrapper[4795]: I0320 17:36:22.971805 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "188f326f-74f0-423d-9ae1-54aae0c1474e" (UID: "188f326f-74f0-423d-9ae1-54aae0c1474e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.018910 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.018954 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgldg\" (UniqueName: \"kubernetes.io/projected/188f326f-74f0-423d-9ae1-54aae0c1474e-kube-api-access-jgldg\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.018965 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/188f326f-74f0-423d-9ae1-54aae0c1474e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.261159 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9eecdd-6791-4b24-8855-6036767861cf" path="/var/lib/kubelet/pods/5d9eecdd-6791-4b24-8855-6036767861cf/volumes" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.457465 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" event={"ID":"188f326f-74f0-423d-9ae1-54aae0c1474e","Type":"ContainerDied","Data":"eceba19972887264330468d5439623bcd97b3e95bfc235711a65beb5805973f3"} Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.457514 4795 scope.go:117] "RemoveContainer" 
containerID="87534ecfb167ccd1f65f7ca6c16ec74ef295ebd69a68982ce1f8e8d67186a8f5" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.457562 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.480008 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:36:23 crc kubenswrapper[4795]: I0320 17:36:23.487868 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-w4cr8"] Mar 20 17:36:25 crc kubenswrapper[4795]: I0320 17:36:25.265752 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" path="/var/lib/kubelet/pods/188f326f-74f0-423d-9ae1-54aae0c1474e/volumes" Mar 20 17:36:26 crc kubenswrapper[4795]: I0320 17:36:26.523207 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-w4cr8" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.97:5353: i/o timeout" Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.415783 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.416454 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:36:34 crc kubenswrapper[4795]: init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 17:36:34 crc kubenswrapper[4795]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 17:36:34 crc kubenswrapper[4795]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 17:36:34 crc kubenswrapper[4795]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 
17:36:34 crc kubenswrapper[4795]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:36:34 crc kubenswrapper[4795]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:36:34 crc kubenswrapper[4795]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:36:34 crc kubenswrapper[4795]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 17:36:34 crc kubenswrapper[4795]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4h5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePu
llPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(b8103489-e552-49b0-a32a-1069a46feff9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 17:36:34 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.417767 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="b8103489-e552-49b0-a32a-1069a46feff9" Mar 20 17:36:34 crc kubenswrapper[4795]: I0320 17:36:34.537424 4795 scope.go:117] "RemoveContainer" containerID="bb90f0d50353c309b35e0e03c747d10c744c81a1b2cc09b5b5dec2672959655d" Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.579190 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-server-0" podUID="b8103489-e552-49b0-a32a-1069a46feff9" Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.630935 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.631390 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 
20 17:36:34 crc kubenswrapper[4795]: init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 17:36:34 crc kubenswrapper[4795]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 17:36:34 crc kubenswrapper[4795]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 17:36:34 crc kubenswrapper[4795]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 17:36:34 crc kubenswrapper[4795]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:36:34 crc kubenswrapper[4795]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:36:34 crc kubenswrapper[4795]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:36:34 crc kubenswrapper[4795]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 17:36:34 crc kubenswrapper[4795]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j6r2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(d3e6834b-7e74-46f8-a734-b473080c05d3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 17:36:34 crc kubenswrapper[4795]: > 
logger="UnhandledError" Mar 20 17:36:34 crc kubenswrapper[4795]: E0320 17:36:34.633552 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.057266 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-n4gzx"] Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.135059 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"] Mar 20 17:36:35 crc kubenswrapper[4795]: W0320 17:36:35.145043 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a94015_8d98_4745_ab7a_74ebdd435638.slice/crio-ee4e28445819c720fb9a1aff178d5135783aa18871dab803797b32cd52667c96 WatchSource:0}: Error finding container ee4e28445819c720fb9a1aff178d5135783aa18871dab803797b32cd52667c96: Status 404 returned error can't find the container with id ee4e28445819c720fb9a1aff178d5135783aa18871dab803797b32cd52667c96 Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.562363 4795 generic.go:334] "Generic (PLEG): container finished" podID="32a94015-8d98-4745-ab7a-74ebdd435638" containerID="c364cb833dc6a826503aaf9ccf6e20afbe333b743f6b2e28ddf8d915f44cb337" exitCode=0 Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.562417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" event={"ID":"32a94015-8d98-4745-ab7a-74ebdd435638","Type":"ContainerDied","Data":"c364cb833dc6a826503aaf9ccf6e20afbe333b743f6b2e28ddf8d915f44cb337"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.562647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" 
event={"ID":"32a94015-8d98-4745-ab7a-74ebdd435638","Type":"ContainerStarted","Data":"ee4e28445819c720fb9a1aff178d5135783aa18871dab803797b32cd52667c96"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.565971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ca95ec62-fce9-4c91-bb59-fa80f512edba","Type":"ContainerStarted","Data":"df3795f1b066389261f49b06978897913a1ad9450900b8a1476362e278dd3477"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.566646 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.569047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-n4gzx" event={"ID":"85004117-20bc-474e-88f5-ce49032749ff","Type":"ContainerStarted","Data":"137a2ea5670e7794f8668086fe831dd3103e9ff4f37fdb89e2c07bc0e196f6e1"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.570500 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987","Type":"ContainerStarted","Data":"be589416fed96b6271b6b5906e6f1d45a7a2e0530fadd8efe4c4d83d2ddd456b"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.572828 4795 generic.go:334] "Generic (PLEG): container finished" podID="38f88deb-b38d-4c52-a901-baeb9da08559" containerID="a9b37f38da5a02a709d41d5d8718cdfaffaae9f225a892b8a803fd7f9d1c5b9d" exitCode=0 Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.572875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-j4mtv" event={"ID":"38f88deb-b38d-4c52-a901-baeb9da08559","Type":"ContainerDied","Data":"a9b37f38da5a02a709d41d5d8718cdfaffaae9f225a892b8a803fd7f9d1c5b9d"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.576571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsqcc" 
event={"ID":"5675bf5e-3a57-4082-8631-680ced6fb634","Type":"ContainerStarted","Data":"7b2cecc174f5dcc6f6034c4b95f2a94c23f585710a36c097c1d013a8018c8d16"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.580353 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f5a24ef-fc80-4386-9f81-5f21154223f3","Type":"ContainerStarted","Data":"20d641e4b2c5af5a648fe4d600e82a2440ccb249de16db64b4e4b4b818f40e8b"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.587623 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7","Type":"ContainerStarted","Data":"3c11b4ea967168fa5d174eecf8dab28998faefca6fdedab2fa357e8713bbe892"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.589610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g" event={"ID":"28df10bb-d6a9-47a9-9b79-0bb9665529ef","Type":"ContainerStarted","Data":"c670dee051c4c8511222332bcd79dd4320ea4d16c1fc4edf2416e15ce5d175dd"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.590779 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-dnp2g" Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.599597 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3b9d4ac2-2b66-441a-a6d4-0d467d857f99","Type":"ContainerStarted","Data":"5caf37cf4351516be5d864c1286eb4d8795b6278dae1f859a384a32c06688c9f"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.605507 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=25.139101628 podStartE2EDuration="35.605488271s" podCreationTimestamp="2026-03-20 17:36:00 +0000 UTC" firstStartedPulling="2026-03-20 17:36:11.019018332 +0000 UTC m=+1114.477049893" lastFinishedPulling="2026-03-20 17:36:21.485404975 +0000 UTC m=+1124.943436536" 
observedRunningTime="2026-03-20 17:36:35.602162918 +0000 UTC m=+1139.060194479" watchObservedRunningTime="2026-03-20 17:36:35.605488271 +0000 UTC m=+1139.063519812" Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.606921 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8","Type":"ContainerStarted","Data":"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802"} Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.607042 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 17:36:35 crc kubenswrapper[4795]: E0320 17:36:35.610967 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.720604 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.032010815 podStartE2EDuration="33.720571573s" podCreationTimestamp="2026-03-20 17:36:02 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.978667025 +0000 UTC m=+1114.436698576" lastFinishedPulling="2026-03-20 17:36:34.667227793 +0000 UTC m=+1138.125259334" observedRunningTime="2026-03-20 17:36:35.714070622 +0000 UTC m=+1139.172102173" watchObservedRunningTime="2026-03-20 17:36:35.720571573 +0000 UTC m=+1139.178603135" Mar 20 17:36:35 crc kubenswrapper[4795]: I0320 17:36:35.736334 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dnp2g" podStartSLOduration=16.861927349 podStartE2EDuration="29.736316894s" podCreationTimestamp="2026-03-20 17:36:06 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.998854464 +0000 UTC m=+1114.456886005" 
lastFinishedPulling="2026-03-20 17:36:23.873244009 +0000 UTC m=+1127.331275550" observedRunningTime="2026-03-20 17:36:35.730512053 +0000 UTC m=+1139.188543594" watchObservedRunningTime="2026-03-20 17:36:35.736316894 +0000 UTC m=+1139.194348435" Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.618396 4795 generic.go:334] "Generic (PLEG): container finished" podID="5675bf5e-3a57-4082-8631-680ced6fb634" containerID="7b2cecc174f5dcc6f6034c4b95f2a94c23f585710a36c097c1d013a8018c8d16" exitCode=0 Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.619486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsqcc" event={"ID":"5675bf5e-3a57-4082-8631-680ced6fb634","Type":"ContainerDied","Data":"7b2cecc174f5dcc6f6034c4b95f2a94c23f585710a36c097c1d013a8018c8d16"} Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.619733 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.619799 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsqcc" event={"ID":"5675bf5e-3a57-4082-8631-680ced6fb634","Type":"ContainerStarted","Data":"f62525ebd7dd589cb5ed4eef766150f5dadbae6274446cd1ee837efe0dd7dc83"} Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.619818 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dsqcc" event={"ID":"5675bf5e-3a57-4082-8631-680ced6fb634","Type":"ContainerStarted","Data":"ccb21af206808d8161b15ed779b3acf783c07dfbafc6494225d5a626c7d3ff23"} Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.619830 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dsqcc" Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.624339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" 
event={"ID":"32a94015-8d98-4745-ab7a-74ebdd435638","Type":"ContainerStarted","Data":"e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68"} Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.641189 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dsqcc" podStartSLOduration=19.879408314 podStartE2EDuration="30.641174022s" podCreationTimestamp="2026-03-20 17:36:06 +0000 UTC" firstStartedPulling="2026-03-20 17:36:11.93799878 +0000 UTC m=+1115.396030321" lastFinishedPulling="2026-03-20 17:36:22.699764488 +0000 UTC m=+1126.157796029" observedRunningTime="2026-03-20 17:36:36.636866618 +0000 UTC m=+1140.094898159" watchObservedRunningTime="2026-03-20 17:36:36.641174022 +0000 UTC m=+1140.099205563" Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.668925 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" podStartSLOduration=17.668882626 podStartE2EDuration="17.668882626s" podCreationTimestamp="2026-03-20 17:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:36.660006699 +0000 UTC m=+1140.118038240" watchObservedRunningTime="2026-03-20 17:36:36.668882626 +0000 UTC m=+1140.126914167" Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.904221 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-j4mtv" Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.985704 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28trk\" (UniqueName: \"kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk\") pod \"38f88deb-b38d-4c52-a901-baeb9da08559\" (UID: \"38f88deb-b38d-4c52-a901-baeb9da08559\") " Mar 20 17:36:36 crc kubenswrapper[4795]: I0320 17:36:36.995963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk" (OuterVolumeSpecName: "kube-api-access-28trk") pod "38f88deb-b38d-4c52-a901-baeb9da08559" (UID: "38f88deb-b38d-4c52-a901-baeb9da08559"). InnerVolumeSpecName "kube-api-access-28trk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.087804 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28trk\" (UniqueName: \"kubernetes.io/projected/38f88deb-b38d-4c52-a901-baeb9da08559-kube-api-access-28trk\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.634096 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-j4mtv" Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.634276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-j4mtv" event={"ID":"38f88deb-b38d-4c52-a901-baeb9da08559","Type":"ContainerDied","Data":"d53c776ab6a465f3a075842cf379886815e4d1caf766e21f6fcc37ed80564b8a"} Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.634788 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d53c776ab6a465f3a075842cf379886815e4d1caf766e21f6fcc37ed80564b8a" Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.635424 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.966273 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-kh5md"] Mar 20 17:36:37 crc kubenswrapper[4795]: I0320 17:36:37.973494 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-kh5md"] Mar 20 17:36:38 crc kubenswrapper[4795]: I0320 17:36:38.644777 4795 generic.go:334] "Generic (PLEG): container finished" podID="0f5a24ef-fc80-4386-9f81-5f21154223f3" containerID="20d641e4b2c5af5a648fe4d600e82a2440ccb249de16db64b4e4b4b818f40e8b" exitCode=0 Mar 20 17:36:38 crc kubenswrapper[4795]: I0320 17:36:38.645821 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f5a24ef-fc80-4386-9f81-5f21154223f3","Type":"ContainerDied","Data":"20d641e4b2c5af5a648fe4d600e82a2440ccb249de16db64b4e4b4b818f40e8b"} Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.263051 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93986a1-82a8-4eac-ba5e-f790196b25ce" path="/var/lib/kubelet/pods/f93986a1-82a8-4eac-ba5e-f790196b25ce/volumes" Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 
17:36:39.656120 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987" containerID="be589416fed96b6271b6b5906e6f1d45a7a2e0530fadd8efe4c4d83d2ddd456b" exitCode=0 Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.656251 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987","Type":"ContainerDied","Data":"be589416fed96b6271b6b5906e6f1d45a7a2e0530fadd8efe4c4d83d2ddd456b"} Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.660164 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c07f346e-3e6c-41a5-bdda-67a4a5f04ba7","Type":"ContainerStarted","Data":"42c51c488fe4bfac9ef63685016e66545b68d399c83290387b1757181c2dd716"} Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.664359 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3b9d4ac2-2b66-441a-a6d4-0d467d857f99","Type":"ContainerStarted","Data":"a7b67360e8b08288f00db3af0c6989e32f69622da33969cfda3444f96aee94dc"} Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.667394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-n4gzx" event={"ID":"85004117-20bc-474e-88f5-ce49032749ff","Type":"ContainerStarted","Data":"aa860c8d92a2db46347d66290b667c4d2b2232b38a2fb358259720ec2599f882"} Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.672528 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f5a24ef-fc80-4386-9f81-5f21154223f3","Type":"ContainerStarted","Data":"44d126d474f5c38d05ae811af673fb00ab0fc770694ad6081494c7edc888a203"} Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.730022 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=31.547693068 podStartE2EDuration="42.730000169s" 
podCreationTimestamp="2026-03-20 17:35:57 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.802632585 +0000 UTC m=+1114.260664126" lastFinishedPulling="2026-03-20 17:36:21.984939686 +0000 UTC m=+1125.442971227" observedRunningTime="2026-03-20 17:36:39.721799624 +0000 UTC m=+1143.179831205" watchObservedRunningTime="2026-03-20 17:36:39.730000169 +0000 UTC m=+1143.188031730" Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.764306 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.12783376 podStartE2EDuration="30.764286467s" podCreationTimestamp="2026-03-20 17:36:09 +0000 UTC" firstStartedPulling="2026-03-20 17:36:11.203473264 +0000 UTC m=+1114.661504815" lastFinishedPulling="2026-03-20 17:36:38.839925981 +0000 UTC m=+1142.297957522" observedRunningTime="2026-03-20 17:36:39.750931171 +0000 UTC m=+1143.208962722" watchObservedRunningTime="2026-03-20 17:36:39.764286467 +0000 UTC m=+1143.222318018" Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.810032 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.114965009 podStartE2EDuration="31.81001274s" podCreationTimestamp="2026-03-20 17:36:08 +0000 UTC" firstStartedPulling="2026-03-20 17:36:11.104805242 +0000 UTC m=+1114.562836783" lastFinishedPulling="2026-03-20 17:36:38.799852973 +0000 UTC m=+1142.257884514" observedRunningTime="2026-03-20 17:36:39.779233842 +0000 UTC m=+1143.237265383" watchObservedRunningTime="2026-03-20 17:36:39.81001274 +0000 UTC m=+1143.268044291" Mar 20 17:36:39 crc kubenswrapper[4795]: I0320 17:36:39.839294 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-n4gzx" podStartSLOduration=17.05935214 podStartE2EDuration="20.839274471s" podCreationTimestamp="2026-03-20 17:36:19 +0000 UTC" firstStartedPulling="2026-03-20 17:36:35.087589418 +0000 UTC m=+1138.545620959" 
lastFinishedPulling="2026-03-20 17:36:38.867511749 +0000 UTC m=+1142.325543290" observedRunningTime="2026-03-20 17:36:39.833925824 +0000 UTC m=+1143.291957365" watchObservedRunningTime="2026-03-20 17:36:39.839274471 +0000 UTC m=+1143.297306012" Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.199603 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.199647 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.253865 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.370142 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.370557 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.419450 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.683300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987","Type":"ContainerStarted","Data":"e9df1e53f87c68e30f263f17ef84197934d01922d65cbffbe45f43cf8a087e5d"} Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.715909 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=28.661704817 podStartE2EDuration="41.71588792s" podCreationTimestamp="2026-03-20 17:35:59 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.812322727 +0000 UTC m=+1114.270354268" lastFinishedPulling="2026-03-20 17:36:23.86650583 
+0000 UTC m=+1127.324537371" observedRunningTime="2026-03-20 17:36:40.711198055 +0000 UTC m=+1144.169229616" watchObservedRunningTime="2026-03-20 17:36:40.71588792 +0000 UTC m=+1144.173919461" Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.733970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 17:36:40 crc kubenswrapper[4795]: I0320 17:36:40.748465 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.020968 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 17:36:41 crc kubenswrapper[4795]: E0320 17:36:41.021299 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f88deb-b38d-4c52-a901-baeb9da08559" containerName="oc" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.021318 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f88deb-b38d-4c52-a901-baeb9da08559" containerName="oc" Mar 20 17:36:41 crc kubenswrapper[4795]: E0320 17:36:41.021353 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="init" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.021362 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="init" Mar 20 17:36:41 crc kubenswrapper[4795]: E0320 17:36:41.021393 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="dnsmasq-dns" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.021402 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" containerName="dnsmasq-dns" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.021571 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="188f326f-74f0-423d-9ae1-54aae0c1474e" 
containerName="dnsmasq-dns" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.021587 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f88deb-b38d-4c52-a901-baeb9da08559" containerName="oc" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.022512 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.024776 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.024926 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7bwm5" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.026388 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.027658 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.043211 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.078461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.082542 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-config\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 
17:36:41.082720 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.082836 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brh4g\" (UniqueName: \"kubernetes.io/projected/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-kube-api-access-brh4g\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.082978 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-scripts\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.083078 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.083156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.101927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/memcached-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185163 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brh4g\" (UniqueName: \"kubernetes.io/projected/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-kube-api-access-brh4g\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-scripts\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185312 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185340 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185435 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-config\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185500 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.185785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.186350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-scripts\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.186485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-config\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.190150 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.213424 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.214608 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.222468 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brh4g\" (UniqueName: \"kubernetes.io/projected/6cfc9397-7268-4bd1-8bbf-d107e94ab35a-kube-api-access-brh4g\") pod \"ovn-northd-0\" (UID: \"6cfc9397-7268-4bd1-8bbf-d107e94ab35a\") " pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.358364 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 17:36:41 crc kubenswrapper[4795]: I0320 17:36:41.825520 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 17:36:41 crc kubenswrapper[4795]: W0320 17:36:41.830710 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cfc9397_7268_4bd1_8bbf_d107e94ab35a.slice/crio-1518a3cecdc9e3a56b40f7254b126c619f179fe07f9eb33c86baa277e9bef838 WatchSource:0}: Error finding container 1518a3cecdc9e3a56b40f7254b126c619f179fe07f9eb33c86baa277e9bef838: Status 404 returned error can't find the container with id 1518a3cecdc9e3a56b40f7254b126c619f179fe07f9eb33c86baa277e9bef838 Mar 20 17:36:42 crc kubenswrapper[4795]: I0320 17:36:42.696862 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6cfc9397-7268-4bd1-8bbf-d107e94ab35a","Type":"ContainerStarted","Data":"1518a3cecdc9e3a56b40f7254b126c619f179fe07f9eb33c86baa277e9bef838"} Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.205909 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"] Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.206847 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="dnsmasq-dns" containerID="cri-o://e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68" gracePeriod=10 Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.207023 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.208145 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.236315 4795 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"] Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.243133 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.277473 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"] Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.323018 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.323088 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.323110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmkt4\" (UniqueName: \"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.323176 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: 
\"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.323236 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: E0320 17:36:43.403181 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a94015_8d98_4745_ab7a_74ebdd435638.slice/crio-e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a94015_8d98_4745_ab7a_74ebdd435638.slice/crio-conmon-e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.424618 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425046 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425078 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkt4\" (UniqueName: \"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.425832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.426142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.426876 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.446823 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmkt4\" (UniqueName: \"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4\") pod \"dnsmasq-dns-698758b865-8grln\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.610702 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.708659 4795 generic.go:334] "Generic (PLEG): container finished" podID="32a94015-8d98-4745-ab7a-74ebdd435638" containerID="e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68" exitCode=0 Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.708715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" event={"ID":"32a94015-8d98-4745-ab7a-74ebdd435638","Type":"ContainerDied","Data":"e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68"} Mar 20 17:36:43 crc kubenswrapper[4795]: I0320 17:36:43.901334 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.034649 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config\") pod \"32a94015-8d98-4745-ab7a-74ebdd435638\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.034783 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm699\" (UniqueName: \"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699\") pod \"32a94015-8d98-4745-ab7a-74ebdd435638\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.034863 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb\") pod \"32a94015-8d98-4745-ab7a-74ebdd435638\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.034885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc\") pod \"32a94015-8d98-4745-ab7a-74ebdd435638\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.034929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb\") pod \"32a94015-8d98-4745-ab7a-74ebdd435638\" (UID: \"32a94015-8d98-4745-ab7a-74ebdd435638\") " Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.042024 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699" (OuterVolumeSpecName: "kube-api-access-gm699") pod "32a94015-8d98-4745-ab7a-74ebdd435638" (UID: "32a94015-8d98-4745-ab7a-74ebdd435638"). InnerVolumeSpecName "kube-api-access-gm699". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.100149 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config" (OuterVolumeSpecName: "config") pod "32a94015-8d98-4745-ab7a-74ebdd435638" (UID: "32a94015-8d98-4745-ab7a-74ebdd435638"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.102420 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32a94015-8d98-4745-ab7a-74ebdd435638" (UID: "32a94015-8d98-4745-ab7a-74ebdd435638"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.102545 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32a94015-8d98-4745-ab7a-74ebdd435638" (UID: "32a94015-8d98-4745-ab7a-74ebdd435638"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.103981 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32a94015-8d98-4745-ab7a-74ebdd435638" (UID: "32a94015-8d98-4745-ab7a-74ebdd435638"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.136221 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm699\" (UniqueName: \"kubernetes.io/projected/32a94015-8d98-4745-ab7a-74ebdd435638-kube-api-access-gm699\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.136249 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.136257 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.136267 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.136277 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a94015-8d98-4745-ab7a-74ebdd435638-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.264260 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"] Mar 20 17:36:44 crc kubenswrapper[4795]: W0320 17:36:44.264546 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod601af69d_c03f_4bdf_b3bf_67ba791674f9.slice/crio-f49aeeb3f2eaf36f87117fd9bb4fc971651fbd9ce295830a95f52f7faa0753d6 WatchSource:0}: Error finding container f49aeeb3f2eaf36f87117fd9bb4fc971651fbd9ce295830a95f52f7faa0753d6: Status 404 returned error can't find 
the container with id f49aeeb3f2eaf36f87117fd9bb4fc971651fbd9ce295830a95f52f7faa0753d6 Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.365479 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:36:44 crc kubenswrapper[4795]: E0320 17:36:44.365844 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="dnsmasq-dns" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.365859 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="dnsmasq-dns" Mar 20 17:36:44 crc kubenswrapper[4795]: E0320 17:36:44.365876 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="init" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.365883 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="init" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.366035 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" containerName="dnsmasq-dns" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.394581 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.397259 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-dvvhq" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.397626 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.397672 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.398565 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.415062 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440362 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4vs\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-kube-api-access-xx4vs\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e955e5-ba7a-4582-9d52-40333fe21b7f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " 
pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-cache\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440627 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-lock\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.440650 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.542494 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-cache\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.542889 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-lock\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.542917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.542944 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4vs\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-kube-api-access-xx4vs\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.542964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e955e5-ba7a-4582-9d52-40333fe21b7f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.543020 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.543073 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-cache\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: E0320 17:36:44.543249 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:36:44 crc kubenswrapper[4795]: E0320 17:36:44.543268 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:36:44 
crc kubenswrapper[4795]: E0320 17:36:44.543322 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:36:45.043303789 +0000 UTC m=+1148.501335330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.543334 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.543758 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/88e955e5-ba7a-4582-9d52-40333fe21b7f-lock\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.547578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e955e5-ba7a-4582-9d52-40333fe21b7f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.561725 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4vs\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-kube-api-access-xx4vs\") pod \"swift-storage-0\" (UID: 
\"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.568395 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.717940 4795 generic.go:334] "Generic (PLEG): container finished" podID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerID="a43865e4251904d08d5c0655d2fd65e83c2843454a9cfbf7734d7aa91dad11f3" exitCode=0 Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.718006 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8grln" event={"ID":"601af69d-c03f-4bdf-b3bf-67ba791674f9","Type":"ContainerDied","Data":"a43865e4251904d08d5c0655d2fd65e83c2843454a9cfbf7734d7aa91dad11f3"} Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.718100 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8grln" event={"ID":"601af69d-c03f-4bdf-b3bf-67ba791674f9","Type":"ContainerStarted","Data":"f49aeeb3f2eaf36f87117fd9bb4fc971651fbd9ce295830a95f52f7faa0753d6"} Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.721344 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6cfc9397-7268-4bd1-8bbf-d107e94ab35a","Type":"ContainerStarted","Data":"3885d095895d5da5a1b34a80367bb40ec5b285efc2098d1ba8e7cbc11155364b"} Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.721387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6cfc9397-7268-4bd1-8bbf-d107e94ab35a","Type":"ContainerStarted","Data":"246986c0d6bbfde2470a9586085454fa6243ebcee0f179784cf0c7047eaf3732"} Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.721526 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.723835 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" event={"ID":"32a94015-8d98-4745-ab7a-74ebdd435638","Type":"ContainerDied","Data":"ee4e28445819c720fb9a1aff178d5135783aa18871dab803797b32cd52667c96"} Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.723875 4795 scope.go:117] "RemoveContainer" containerID="e66bb8741b8637dd4f9608f0576cd56ac309d19839bb6101a8574b9b83b98c68" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.724049 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-g9nkw" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.764603 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.704862109 podStartE2EDuration="4.764585498s" podCreationTimestamp="2026-03-20 17:36:40 +0000 UTC" firstStartedPulling="2026-03-20 17:36:41.833131411 +0000 UTC m=+1145.291162952" lastFinishedPulling="2026-03-20 17:36:43.89285479 +0000 UTC m=+1147.350886341" observedRunningTime="2026-03-20 17:36:44.762425321 +0000 UTC m=+1148.220456872" watchObservedRunningTime="2026-03-20 17:36:44.764585498 +0000 UTC m=+1148.222617049" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.835723 4795 scope.go:117] "RemoveContainer" containerID="c364cb833dc6a826503aaf9ccf6e20afbe333b743f6b2e28ddf8d915f44cb337" Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.861817 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"] Mar 20 17:36:44 crc kubenswrapper[4795]: I0320 17:36:44.868985 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-g9nkw"] Mar 20 17:36:45 crc kubenswrapper[4795]: I0320 17:36:45.051025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:45 crc kubenswrapper[4795]: E0320 17:36:45.051248 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:36:45 crc kubenswrapper[4795]: E0320 17:36:45.051267 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:36:45 crc kubenswrapper[4795]: E0320 17:36:45.051325 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:36:46.051305893 +0000 UTC m=+1149.509337434 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found Mar 20 17:36:45 crc kubenswrapper[4795]: I0320 17:36:45.261363 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a94015-8d98-4745-ab7a-74ebdd435638" path="/var/lib/kubelet/pods/32a94015-8d98-4745-ab7a-74ebdd435638/volumes" Mar 20 17:36:45 crc kubenswrapper[4795]: I0320 17:36:45.736793 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8grln" event={"ID":"601af69d-c03f-4bdf-b3bf-67ba791674f9","Type":"ContainerStarted","Data":"7043047e88a0378017830c2f9e0915780f4eef0732a290fd3459875c42d1f7cc"} Mar 20 17:36:45 crc kubenswrapper[4795]: I0320 17:36:45.736956 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:45 crc 
kubenswrapper[4795]: I0320 17:36:45.768214 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-8grln" podStartSLOduration=2.76819001 podStartE2EDuration="2.76819001s" podCreationTimestamp="2026-03-20 17:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:45.756001191 +0000 UTC m=+1149.214032782" watchObservedRunningTime="2026-03-20 17:36:45.76819001 +0000 UTC m=+1149.226221591" Mar 20 17:36:46 crc kubenswrapper[4795]: I0320 17:36:46.064978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:46 crc kubenswrapper[4795]: E0320 17:36:46.065239 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:36:46 crc kubenswrapper[4795]: E0320 17:36:46.065277 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:36:46 crc kubenswrapper[4795]: E0320 17:36:46.065352 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:36:48.06533119 +0000 UTC m=+1151.523362741 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.102809 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:48 crc kubenswrapper[4795]: E0320 17:36:48.102974 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:36:48 crc kubenswrapper[4795]: E0320 17:36:48.103754 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:36:48 crc kubenswrapper[4795]: E0320 17:36:48.103832 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:36:52.103807389 +0000 UTC m=+1155.561838960 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.283473 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-m8zw5"] Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.285539 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.288366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.289023 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.290620 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306106 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m8zw5"] Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306606 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306637 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 
17:36:48.306695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306792 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.306861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxsvz\" (UniqueName: \"kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.407638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 
17:36:48.408011 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408057 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408113 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxsvz\" (UniqueName: \"kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.408653 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.409134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.409139 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.414257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.414373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle\") pod 
\"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.414437 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.431286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxsvz\" (UniqueName: \"kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz\") pod \"swift-ring-rebalance-m8zw5\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") " pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:48 crc kubenswrapper[4795]: I0320 17:36:48.613597 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m8zw5" Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.192663 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m8zw5"] Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.243786 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.244340 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.336159 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.659513 4795 scope.go:117] "RemoveContainer" containerID="56b4e175842a208b79b6d416a354b0c057585a391bd973a1b6ce26b23a0cd738" Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.771954 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerStarted","Data":"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157"} Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.773713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m8zw5" event={"ID":"2c422574-0103-4c97-9e23-5a78c5b44e69","Type":"ContainerStarted","Data":"97fb6267b98d2b148faab38e0f46037ac0fdd70749948e0fa5391492bff624c1"} Mar 20 17:36:49 crc kubenswrapper[4795]: I0320 17:36:49.880657 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 17:36:50 crc kubenswrapper[4795]: I0320 17:36:50.631983 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:50 crc kubenswrapper[4795]: I0320 17:36:50.632042 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:50 crc kubenswrapper[4795]: I0320 17:36:50.716130 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:50 crc kubenswrapper[4795]: I0320 17:36:50.789221 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerStarted","Data":"5473602d5499b1067c63d6b98d02f2810f56405e993453774e2f6c5d19c36aea"} Mar 20 17:36:50 crc kubenswrapper[4795]: I0320 17:36:50.897289 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.306194 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-30ae-account-create-update-d79gp"] Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.307526 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.309563 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.324858 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-30ae-account-create-update-d79gp"] Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.343892 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8d96q"] Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.348626 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.356939 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8d96q"] Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.462383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnn82\" (UniqueName: \"kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.462496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.462550 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.462621 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg59n\" (UniqueName: \"kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.564057 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.564114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.564172 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg59n\" (UniqueName: \"kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.564276 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnn82\" (UniqueName: \"kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.565440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 
17:36:51.566105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.585665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg59n\" (UniqueName: \"kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n\") pod \"glance-db-create-8d96q\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " pod="openstack/glance-db-create-8d96q" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.585671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnn82\" (UniqueName: \"kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82\") pod \"glance-30ae-account-create-update-d79gp\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.627424 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:51 crc kubenswrapper[4795]: I0320 17:36:51.672432 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8d96q" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.012818 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ncfp9"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.014029 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.027603 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ncfp9"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.081006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.081249 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbd58\" (UniqueName: \"kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.145381 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1681-account-create-update-vpwb2"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.146576 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.148359 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.158087 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1681-account-create-update-vpwb2"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.183296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.183364 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbd58\" (UniqueName: \"kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: E0320 17:36:52.183407 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:36:52 crc kubenswrapper[4795]: E0320 17:36:52.183423 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.183438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: E0320 
17:36:52.183463 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:37:00.183447501 +0000 UTC m=+1163.641479042 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.184045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.202052 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbd58\" (UniqueName: \"kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58\") pod \"keystone-db-create-ncfp9\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.244058 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-c5rg6"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.245097 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.260400 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-c5rg6"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.285384 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.285522 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hppz4\" (UniqueName: \"kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.348992 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.353186 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a409-account-create-update-zvscf"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.366058 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a409-account-create-update-zvscf"] Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.366168 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.369083 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.390608 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hppz4\" (UniqueName: \"kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.390674 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78bqb\" (UniqueName: \"kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.390762 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.390786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.393419 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.406433 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hppz4\" (UniqueName: \"kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4\") pod \"keystone-1681-account-create-update-vpwb2\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.462768 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.492112 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88z4m\" (UniqueName: \"kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.492179 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.492226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.492278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78bqb\" (UniqueName: \"kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.493370 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.513594 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78bqb\" (UniqueName: \"kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb\") pod \"placement-db-create-c5rg6\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.564013 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.594155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.594258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88z4m\" (UniqueName: \"kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.595136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.610486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88z4m\" (UniqueName: \"kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m\") pod \"placement-a409-account-create-update-zvscf\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:52 crc kubenswrapper[4795]: I0320 17:36:52.698574 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.614928 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.618213 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ncfp9"] Mar 20 17:36:53 crc kubenswrapper[4795]: W0320 17:36:53.620102 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aac28d5_6b58_424e_83f8_ec71c53e41ce.slice/crio-690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c WatchSource:0}: Error finding container 690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c: Status 404 returned error can't find the container with id 690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.634010 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-30ae-account-create-update-d79gp"] Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.639667 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a409-account-create-update-zvscf"] Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.679094 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.679285 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="dnsmasq-dns" containerID="cri-o://cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe" gracePeriod=10 Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.772295 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-1681-account-create-update-vpwb2"] Mar 20 17:36:53 crc kubenswrapper[4795]: W0320 17:36:53.782941 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9265b8c_0b80_47d9_8f4b_3d996233341e.slice/crio-e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876 WatchSource:0}: Error finding container e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876: Status 404 returned error can't find the container with id e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876 Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.791818 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-c5rg6"] Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.802388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8d96q"] Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.817645 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8d96q" event={"ID":"389c1f10-5aba-4c4d-b0b3-3a38f6038536","Type":"ContainerStarted","Data":"10d072b8e1df887899cd7e286a697237d05c31ae70d12b646ba13bce102cacf2"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.824643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a409-account-create-update-zvscf" event={"ID":"acfb1ea8-a8d2-4152-ad18-54d380b289c4","Type":"ContainerStarted","Data":"74a7613978faa2356589d41700c5d796593e713d0f9d37d06f172fe2c54e0f4d"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.825901 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m8zw5" event={"ID":"2c422574-0103-4c97-9e23-5a78c5b44e69","Type":"ContainerStarted","Data":"5d7bc73635aabb988bf8536345a728b4481d0d64844c13a801df663371346492"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.828582 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-30ae-account-create-update-d79gp" event={"ID":"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87","Type":"ContainerStarted","Data":"9819bc13c4d025d57bc13c848c09e31cf44875556c9e7737142ef481557663b2"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.836202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1681-account-create-update-vpwb2" event={"ID":"c9265b8c-0b80-47d9-8f4b-3d996233341e","Type":"ContainerStarted","Data":"e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.837585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ncfp9" event={"ID":"6aac28d5-6b58-424e-83f8-ec71c53e41ce","Type":"ContainerStarted","Data":"690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c"} Mar 20 17:36:53 crc kubenswrapper[4795]: I0320 17:36:53.865648 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-m8zw5" podStartSLOduration=1.883358547 podStartE2EDuration="5.865622577s" podCreationTimestamp="2026-03-20 17:36:48 +0000 UTC" firstStartedPulling="2026-03-20 17:36:49.195614708 +0000 UTC m=+1152.653646239" lastFinishedPulling="2026-03-20 17:36:53.177878728 +0000 UTC m=+1156.635910269" observedRunningTime="2026-03-20 17:36:53.845862442 +0000 UTC m=+1157.303893983" watchObservedRunningTime="2026-03-20 17:36:53.865622577 +0000 UTC m=+1157.323654118" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.169317 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.259390 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc\") pod \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.259473 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config\") pod \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.259648 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccwzs\" (UniqueName: \"kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs\") pod \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\" (UID: \"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2\") " Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.275636 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs" (OuterVolumeSpecName: "kube-api-access-ccwzs") pod "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" (UID: "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2"). InnerVolumeSpecName "kube-api-access-ccwzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.302096 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config" (OuterVolumeSpecName: "config") pod "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" (UID: "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.309289 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" (UID: "ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.361573 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccwzs\" (UniqueName: \"kubernetes.io/projected/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-kube-api-access-ccwzs\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.361616 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.361629 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.845278 4795 generic.go:334] "Generic (PLEG): container finished" podID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerID="cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.845349 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.845365 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" event={"ID":"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2","Type":"ContainerDied","Data":"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.845394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gg8p4" event={"ID":"ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2","Type":"ContainerDied","Data":"a6075430a5a7a42dcee3b92020556eee6821261a85421f6ff8cc34985b56804c"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.845410 4795 scope.go:117] "RemoveContainer" containerID="cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.846920 4795 generic.go:334] "Generic (PLEG): container finished" podID="acfb1ea8-a8d2-4152-ad18-54d380b289c4" containerID="1fe0f6a7ba267ec0588c7d4179b78569ace69ae42af4f4ce02a9e28bfc87aa93" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.846996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a409-account-create-update-zvscf" event={"ID":"acfb1ea8-a8d2-4152-ad18-54d380b289c4","Type":"ContainerDied","Data":"1fe0f6a7ba267ec0588c7d4179b78569ace69ae42af4f4ce02a9e28bfc87aa93"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.848210 4795 generic.go:334] "Generic (PLEG): container finished" podID="f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" containerID="115e8aa5e635a588311da0792150e7730feaab865eb0acb01117eb70b42bfde3" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.848281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c5rg6" 
event={"ID":"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc","Type":"ContainerDied","Data":"115e8aa5e635a588311da0792150e7730feaab865eb0acb01117eb70b42bfde3"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.848300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c5rg6" event={"ID":"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc","Type":"ContainerStarted","Data":"a52e3060dd33f381cd140ac8936ebc19848c276f6cba28e6c942f6eae0bfa041"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.850348 4795 generic.go:334] "Generic (PLEG): container finished" podID="1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" containerID="fa7f816c765d44ed743198c38348dd663b04f7cfc3b7f6aac5dffa2623d4db45" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.850396 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-30ae-account-create-update-d79gp" event={"ID":"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87","Type":"ContainerDied","Data":"fa7f816c765d44ed743198c38348dd663b04f7cfc3b7f6aac5dffa2623d4db45"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.852179 4795 generic.go:334] "Generic (PLEG): container finished" podID="c9265b8c-0b80-47d9-8f4b-3d996233341e" containerID="c288b4ff895d130555ade7ce513d591493310d7fb3678ee47968d204fe11297a" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.852245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1681-account-create-update-vpwb2" event={"ID":"c9265b8c-0b80-47d9-8f4b-3d996233341e","Type":"ContainerDied","Data":"c288b4ff895d130555ade7ce513d591493310d7fb3678ee47968d204fe11297a"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.853958 4795 generic.go:334] "Generic (PLEG): container finished" podID="6aac28d5-6b58-424e-83f8-ec71c53e41ce" containerID="2d63356a6d0232331bb76203b1359e46e0f2a21a5ebc5f3160865388f8cf9a1c" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.853996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-ncfp9" event={"ID":"6aac28d5-6b58-424e-83f8-ec71c53e41ce","Type":"ContainerDied","Data":"2d63356a6d0232331bb76203b1359e46e0f2a21a5ebc5f3160865388f8cf9a1c"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.855332 4795 generic.go:334] "Generic (PLEG): container finished" podID="389c1f10-5aba-4c4d-b0b3-3a38f6038536" containerID="61c0a7747547de21c917366527d52306c78b302a64918960d0e832416be0ca0f" exitCode=0 Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.855426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8d96q" event={"ID":"389c1f10-5aba-4c4d-b0b3-3a38f6038536","Type":"ContainerDied","Data":"61c0a7747547de21c917366527d52306c78b302a64918960d0e832416be0ca0f"} Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.871489 4795 scope.go:117] "RemoveContainer" containerID="dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.914254 4795 scope.go:117] "RemoveContainer" containerID="cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe" Mar 20 17:36:54 crc kubenswrapper[4795]: E0320 17:36:54.914780 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe\": container with ID starting with cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe not found: ID does not exist" containerID="cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.914836 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe"} err="failed to get container status \"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe\": rpc error: code = NotFound desc = could not find container 
\"cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe\": container with ID starting with cbf472505256faee617dc03960955d61704d7dc5e754fbc924e90804ce8018fe not found: ID does not exist" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.914870 4795 scope.go:117] "RemoveContainer" containerID="dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589" Mar 20 17:36:54 crc kubenswrapper[4795]: E0320 17:36:54.915264 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589\": container with ID starting with dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589 not found: ID does not exist" containerID="dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.915309 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589"} err="failed to get container status \"dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589\": rpc error: code = NotFound desc = could not find container \"dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589\": container with ID starting with dd9ce211ff0cb422e02fb227f4ac6e73309a9d533d98625386e7cd4db46a1589 not found: ID does not exist" Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.987226 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:36:54 crc kubenswrapper[4795]: I0320 17:36:54.992321 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gg8p4"] Mar 20 17:36:55 crc kubenswrapper[4795]: I0320 17:36:55.265397 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" 
path="/var/lib/kubelet/pods/ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2/volumes" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.174101 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.298724 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.304162 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts\") pod \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.304278 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnn82\" (UniqueName: \"kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82\") pod \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\" (UID: \"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.305077 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" (UID: "1c13d27e-7afd-4113-9b7d-fcbf7eb87c87"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.316884 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82" (OuterVolumeSpecName: "kube-api-access-rnn82") pod "1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" (UID: "1c13d27e-7afd-4113-9b7d-fcbf7eb87c87"). InnerVolumeSpecName "kube-api-access-rnn82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.409177 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hppz4\" (UniqueName: \"kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4\") pod \"c9265b8c-0b80-47d9-8f4b-3d996233341e\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.409306 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts\") pod \"c9265b8c-0b80-47d9-8f4b-3d996233341e\" (UID: \"c9265b8c-0b80-47d9-8f4b-3d996233341e\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.409776 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.409793 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnn82\" (UniqueName: \"kubernetes.io/projected/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87-kube-api-access-rnn82\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.413567 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9265b8c-0b80-47d9-8f4b-3d996233341e" (UID: "c9265b8c-0b80-47d9-8f4b-3d996233341e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.414230 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4" (OuterVolumeSpecName: "kube-api-access-hppz4") pod "c9265b8c-0b80-47d9-8f4b-3d996233341e" (UID: "c9265b8c-0b80-47d9-8f4b-3d996233341e"). InnerVolumeSpecName "kube-api-access-hppz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.457574 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.493818 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.495758 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.510239 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8d96q" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.511700 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hppz4\" (UniqueName: \"kubernetes.io/projected/c9265b8c-0b80-47d9-8f4b-3d996233341e-kube-api-access-hppz4\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.511723 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9265b8c-0b80-47d9-8f4b-3d996233341e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613258 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88z4m\" (UniqueName: \"kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m\") pod \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613368 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78bqb\" (UniqueName: \"kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb\") pod \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613434 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts\") pod \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\" (UID: \"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts\") pod 
\"acfb1ea8-a8d2-4152-ad18-54d380b289c4\" (UID: \"acfb1ea8-a8d2-4152-ad18-54d380b289c4\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613490 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts\") pod \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613519 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts\") pod \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613554 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbd58\" (UniqueName: \"kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58\") pod \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\" (UID: \"6aac28d5-6b58-424e-83f8-ec71c53e41ce\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.613584 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg59n\" (UniqueName: \"kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n\") pod \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\" (UID: \"389c1f10-5aba-4c4d-b0b3-3a38f6038536\") " Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.615851 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "acfb1ea8-a8d2-4152-ad18-54d380b289c4" (UID: "acfb1ea8-a8d2-4152-ad18-54d380b289c4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.617409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6aac28d5-6b58-424e-83f8-ec71c53e41ce" (UID: "6aac28d5-6b58-424e-83f8-ec71c53e41ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.617701 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "389c1f10-5aba-4c4d-b0b3-3a38f6038536" (UID: "389c1f10-5aba-4c4d-b0b3-3a38f6038536"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.617854 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" (UID: "f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.618864 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n" (OuterVolumeSpecName: "kube-api-access-wg59n") pod "389c1f10-5aba-4c4d-b0b3-3a38f6038536" (UID: "389c1f10-5aba-4c4d-b0b3-3a38f6038536"). InnerVolumeSpecName "kube-api-access-wg59n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.619356 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58" (OuterVolumeSpecName: "kube-api-access-rbd58") pod "6aac28d5-6b58-424e-83f8-ec71c53e41ce" (UID: "6aac28d5-6b58-424e-83f8-ec71c53e41ce"). InnerVolumeSpecName "kube-api-access-rbd58". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.619441 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m" (OuterVolumeSpecName: "kube-api-access-88z4m") pod "acfb1ea8-a8d2-4152-ad18-54d380b289c4" (UID: "acfb1ea8-a8d2-4152-ad18-54d380b289c4"). InnerVolumeSpecName "kube-api-access-88z4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.620544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb" (OuterVolumeSpecName: "kube-api-access-78bqb") pod "f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" (UID: "f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc"). InnerVolumeSpecName "kube-api-access-78bqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715764 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78bqb\" (UniqueName: \"kubernetes.io/projected/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-kube-api-access-78bqb\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715805 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715819 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acfb1ea8-a8d2-4152-ad18-54d380b289c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715833 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/389c1f10-5aba-4c4d-b0b3-3a38f6038536-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715845 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aac28d5-6b58-424e-83f8-ec71c53e41ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715856 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbd58\" (UniqueName: \"kubernetes.io/projected/6aac28d5-6b58-424e-83f8-ec71c53e41ce-kube-api-access-rbd58\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715868 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg59n\" (UniqueName: \"kubernetes.io/projected/389c1f10-5aba-4c4d-b0b3-3a38f6038536-kube-api-access-wg59n\") on node \"crc\" DevicePath \"\"" Mar 20 
17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.715881 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88z4m\" (UniqueName: \"kubernetes.io/projected/acfb1ea8-a8d2-4152-ad18-54d380b289c4-kube-api-access-88z4m\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.886609 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a409-account-create-update-zvscf" event={"ID":"acfb1ea8-a8d2-4152-ad18-54d380b289c4","Type":"ContainerDied","Data":"74a7613978faa2356589d41700c5d796593e713d0f9d37d06f172fe2c54e0f4d"} Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.886660 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74a7613978faa2356589d41700c5d796593e713d0f9d37d06f172fe2c54e0f4d" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.886748 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a409-account-create-update-zvscf" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.892296 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c5rg6" event={"ID":"f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc","Type":"ContainerDied","Data":"a52e3060dd33f381cd140ac8936ebc19848c276f6cba28e6c942f6eae0bfa041"} Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.892389 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a52e3060dd33f381cd140ac8936ebc19848c276f6cba28e6c942f6eae0bfa041" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.892491 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-c5rg6" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.895145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-30ae-account-create-update-d79gp" event={"ID":"1c13d27e-7afd-4113-9b7d-fcbf7eb87c87","Type":"ContainerDied","Data":"9819bc13c4d025d57bc13c848c09e31cf44875556c9e7737142ef481557663b2"} Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.895180 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9819bc13c4d025d57bc13c848c09e31cf44875556c9e7737142ef481557663b2" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.895240 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-30ae-account-create-update-d79gp" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.897654 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1681-account-create-update-vpwb2" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.897640 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1681-account-create-update-vpwb2" event={"ID":"c9265b8c-0b80-47d9-8f4b-3d996233341e","Type":"ContainerDied","Data":"e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876"} Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.898034 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e121bfead8c4d632f71717b51f5308e6324b91dd783fc02847d0f83f9f165876" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.902957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ncfp9" event={"ID":"6aac28d5-6b58-424e-83f8-ec71c53e41ce","Type":"ContainerDied","Data":"690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c"} Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.903006 4795 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="690cecc3e162df483b1ce7bd20d4b127657763194174cda01d6952705444814c" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.903102 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ncfp9" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.910709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8d96q" event={"ID":"389c1f10-5aba-4c4d-b0b3-3a38f6038536","Type":"ContainerDied","Data":"10d072b8e1df887899cd7e286a697237d05c31ae70d12b646ba13bce102cacf2"} Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.910740 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10d072b8e1df887899cd7e286a697237d05c31ae70d12b646ba13bce102cacf2" Mar 20 17:36:56 crc kubenswrapper[4795]: I0320 17:36:56.910824 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8d96q" Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.870534 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-c74j8"] Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871345 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" containerName="mariadb-database-create" Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871361 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" containerName="mariadb-database-create" Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871378 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" containerName="mariadb-account-create-update" Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871386 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" containerName="mariadb-account-create-update" Mar 20 
17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871404 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="init"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871414 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="init"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871427 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acfb1ea8-a8d2-4152-ad18-54d380b289c4" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871436 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfb1ea8-a8d2-4152-ad18-54d380b289c4" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871451 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389c1f10-5aba-4c4d-b0b3-3a38f6038536" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871480 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="389c1f10-5aba-4c4d-b0b3-3a38f6038536" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871491 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9265b8c-0b80-47d9-8f4b-3d996233341e" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871499 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9265b8c-0b80-47d9-8f4b-3d996233341e" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871514 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="dnsmasq-dns"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871522 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="dnsmasq-dns"
Mar 20 17:36:57 crc kubenswrapper[4795]: E0320 17:36:57.871566 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aac28d5-6b58-424e-83f8-ec71c53e41ce" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871575 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aac28d5-6b58-424e-83f8-ec71c53e41ce" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871805 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aac28d5-6b58-424e-83f8-ec71c53e41ce" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871821 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="389c1f10-5aba-4c4d-b0b3-3a38f6038536" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871857 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="acfb1ea8-a8d2-4152-ad18-54d380b289c4" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871873 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9265b8c-0b80-47d9-8f4b-3d996233341e" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871890 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceab01d4-b57b-48c8-90e2-ae6cb6edc7e2" containerName="dnsmasq-dns"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871904 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" containerName="mariadb-database-create"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.871944 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" containerName="mariadb-account-create-update"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.872826 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.876845 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 20 17:36:57 crc kubenswrapper[4795]: I0320 17:36:57.886750 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c74j8"]
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.041354 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4wxn\" (UniqueName: \"kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.041440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.142734 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.142908 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4wxn\" (UniqueName: \"kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.143611 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.185032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4wxn\" (UniqueName: \"kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn\") pod \"root-account-create-update-c74j8\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") " pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.204916 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c74j8"
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.644893 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c74j8"]
Mar 20 17:36:58 crc kubenswrapper[4795]: I0320 17:36:58.940358 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c74j8" event={"ID":"135e322f-177c-4bbb-bb3d-0ab19eba6f92","Type":"ContainerStarted","Data":"7997aceefc0685263d40f8dcf307e7e1ba659185b7b47b791ff788b49578f153"}
Mar 20 17:36:59 crc kubenswrapper[4795]: I0320 17:36:59.957702 4795 generic.go:334] "Generic (PLEG): container finished" podID="135e322f-177c-4bbb-bb3d-0ab19eba6f92" containerID="64da53ed17d6e7c8ed644863f568fc0f6e5e946972ad8fd66ba6db39c157b1e6" exitCode=0
Mar 20 17:36:59 crc kubenswrapper[4795]: I0320 17:36:59.957763 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c74j8" event={"ID":"135e322f-177c-4bbb-bb3d-0ab19eba6f92","Type":"ContainerDied","Data":"64da53ed17d6e7c8ed644863f568fc0f6e5e946972ad8fd66ba6db39c157b1e6"}
Mar 20 17:37:00 crc kubenswrapper[4795]: I0320 17:37:00.281191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0"
Mar 20 17:37:00 crc kubenswrapper[4795]: E0320 17:37:00.281426 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 17:37:00 crc kubenswrapper[4795]: E0320 17:37:00.281456 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 17:37:00 crc kubenswrapper[4795]: E0320 17:37:00.281515 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift podName:88e955e5-ba7a-4582-9d52-40333fe21b7f nodeName:}" failed. No retries permitted until 2026-03-20 17:37:16.281494646 +0000 UTC m=+1179.739526187 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift") pod "swift-storage-0" (UID: "88e955e5-ba7a-4582-9d52-40333fe21b7f") : configmap "swift-ring-files" not found
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.314238 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c74j8"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.408896 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4wxn\" (UniqueName: \"kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn\") pod \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") "
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.408986 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts\") pod \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\" (UID: \"135e322f-177c-4bbb-bb3d-0ab19eba6f92\") "
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.409860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "135e322f-177c-4bbb-bb3d-0ab19eba6f92" (UID: "135e322f-177c-4bbb-bb3d-0ab19eba6f92"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.417164 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn" (OuterVolumeSpecName: "kube-api-access-j4wxn") pod "135e322f-177c-4bbb-bb3d-0ab19eba6f92" (UID: "135e322f-177c-4bbb-bb3d-0ab19eba6f92"). InnerVolumeSpecName "kube-api-access-j4wxn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.458031 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.511176 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4wxn\" (UniqueName: \"kubernetes.io/projected/135e322f-177c-4bbb-bb3d-0ab19eba6f92-kube-api-access-j4wxn\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.511215 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/135e322f-177c-4bbb-bb3d-0ab19eba6f92-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.636230 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-sv5fz"]
Mar 20 17:37:01 crc kubenswrapper[4795]: E0320 17:37:01.636603 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="135e322f-177c-4bbb-bb3d-0ab19eba6f92" containerName="mariadb-account-create-update"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.636623 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="135e322f-177c-4bbb-bb3d-0ab19eba6f92" containerName="mariadb-account-create-update"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.636833 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="135e322f-177c-4bbb-bb3d-0ab19eba6f92" containerName="mariadb-account-create-update"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.637470 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.640657 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s6lrt"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.649382 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.650995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sv5fz"]
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.816858 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.816907 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.816959 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.817156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz227\" (UniqueName: \"kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.919101 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz227\" (UniqueName: \"kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.919216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.919242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.919298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.925171 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.925616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.928559 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.940814 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz227\" (UniqueName: \"kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227\") pod \"glance-db-sync-sv5fz\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") " pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.951800 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.976124 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c74j8"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.976123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c74j8" event={"ID":"135e322f-177c-4bbb-bb3d-0ab19eba6f92","Type":"ContainerDied","Data":"7997aceefc0685263d40f8dcf307e7e1ba659185b7b47b791ff788b49578f153"}
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.976294 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7997aceefc0685263d40f8dcf307e7e1ba659185b7b47b791ff788b49578f153"
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.985859 4795 generic.go:334] "Generic (PLEG): container finished" podID="2c422574-0103-4c97-9e23-5a78c5b44e69" containerID="5d7bc73635aabb988bf8536345a728b4481d0d64844c13a801df663371346492" exitCode=0
Mar 20 17:37:01 crc kubenswrapper[4795]: I0320 17:37:01.985903 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m8zw5" event={"ID":"2c422574-0103-4c97-9e23-5a78c5b44e69","Type":"ContainerDied","Data":"5d7bc73635aabb988bf8536345a728b4481d0d64844c13a801df663371346492"}
Mar 20 17:37:02 crc kubenswrapper[4795]: I0320 17:37:02.519452 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sv5fz"]
Mar 20 17:37:02 crc kubenswrapper[4795]: W0320 17:37:02.520422 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode951c331_872c_41b6_b747_d5129b8c0a1b.slice/crio-846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e WatchSource:0}: Error finding container 846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e: Status 404 returned error can't find the container with id 846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e
Mar 20 17:37:02 crc kubenswrapper[4795]: I0320 17:37:02.992658 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sv5fz" event={"ID":"e951c331-872c-41b6-b747-d5129b8c0a1b","Type":"ContainerStarted","Data":"846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e"}
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.312011 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m8zw5"
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441594 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441667 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441767 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.441791 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxsvz\" (UniqueName: \"kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz\") pod \"2c422574-0103-4c97-9e23-5a78c5b44e69\" (UID: \"2c422574-0103-4c97-9e23-5a78c5b44e69\") "
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.442904 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.443148 4795 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.443714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.453265 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz" (OuterVolumeSpecName: "kube-api-access-lxsvz") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "kube-api-access-lxsvz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.458555 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.464307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts" (OuterVolumeSpecName: "scripts") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.469303 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.470943 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c422574-0103-4c97-9e23-5a78c5b44e69" (UID: "2c422574-0103-4c97-9e23-5a78c5b44e69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545456 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxsvz\" (UniqueName: \"kubernetes.io/projected/2c422574-0103-4c97-9e23-5a78c5b44e69-kube-api-access-lxsvz\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545498 4795 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545509 4795 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545522 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c422574-0103-4c97-9e23-5a78c5b44e69-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545535 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c422574-0103-4c97-9e23-5a78c5b44e69-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:03 crc kubenswrapper[4795]: I0320 17:37:03.545547 4795 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2c422574-0103-4c97-9e23-5a78c5b44e69-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:04 crc kubenswrapper[4795]: I0320 17:37:04.002104 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m8zw5" event={"ID":"2c422574-0103-4c97-9e23-5a78c5b44e69","Type":"ContainerDied","Data":"97fb6267b98d2b148faab38e0f46037ac0fdd70749948e0fa5391492bff624c1"}
Mar 20 17:37:04 crc kubenswrapper[4795]: I0320 17:37:04.002161 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97fb6267b98d2b148faab38e0f46037ac0fdd70749948e0fa5391492bff624c1"
Mar 20 17:37:04 crc kubenswrapper[4795]: I0320 17:37:04.002172 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m8zw5"
Mar 20 17:37:04 crc kubenswrapper[4795]: I0320 17:37:04.323616 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-c74j8"]
Mar 20 17:37:04 crc kubenswrapper[4795]: I0320 17:37:04.329202 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-c74j8"]
Mar 20 17:37:05 crc kubenswrapper[4795]: I0320 17:37:05.265878 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="135e322f-177c-4bbb-bb3d-0ab19eba6f92" path="/var/lib/kubelet/pods/135e322f-177c-4bbb-bb3d-0ab19eba6f92/volumes"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.437048 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dnp2g" podUID="28df10bb-d6a9-47a9-9b79-0bb9665529ef" containerName="ovn-controller" probeResult="failure" output=<
Mar 20 17:37:06 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 20 17:37:06 crc kubenswrapper[4795]: >
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.447300 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.481098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dsqcc"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.690222 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dnp2g-config-q9gtk"]
Mar 20 17:37:06 crc kubenswrapper[4795]: E0320 17:37:06.690931 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c422574-0103-4c97-9e23-5a78c5b44e69" containerName="swift-ring-rebalance"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.690949 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c422574-0103-4c97-9e23-5a78c5b44e69" containerName="swift-ring-rebalance"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.691140 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c422574-0103-4c97-9e23-5a78c5b44e69" containerName="swift-ring-rebalance"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.691808 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.694297 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.697271 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g-config-q9gtk"]
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.732413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.732569 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.732624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.732707 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb27s\" (UniqueName: \"kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.733043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.733082 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834531 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834791 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834871 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.834905 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb27s\" (UniqueName: \"kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.835418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.835468 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.835501 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.836939 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.836987 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:06 crc kubenswrapper[4795]: I0320 17:37:06.865058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb27s\" (UniqueName: \"kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s\") pod \"ovn-controller-dnp2g-config-q9gtk\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:07 crc kubenswrapper[4795]: I0320 17:37:07.016766 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-q9gtk"
Mar 20 17:37:07 crc kubenswrapper[4795]: I0320 17:37:07.459230 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g-config-q9gtk"]
Mar 20 17:37:07 crc kubenswrapper[4795]: W0320 17:37:07.468284 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod170d948e_372e_4b54_8ecf_c370d4b10acb.slice/crio-e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c WatchSource:0}: Error finding container e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c: Status 404 returned error can't find the container with id e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c
Mar 20 17:37:08 crc kubenswrapper[4795]: I0320 17:37:08.040775 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-q9gtk" event={"ID":"170d948e-372e-4b54-8ecf-c370d4b10acb","Type":"ContainerStarted","Data":"b72c1a68c2af640e452031ab226e7a764d4d714ebc4a58907bac640f8e0500bb"}
Mar 20 17:37:08 crc kubenswrapper[4795]: I0320 17:37:08.041032 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-q9gtk" event={"ID":"170d948e-372e-4b54-8ecf-c370d4b10acb","Type":"ContainerStarted","Data":"e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c"}
Mar 20 17:37:08 crc kubenswrapper[4795]: I0320 17:37:08.064012 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dnp2g-config-q9gtk" podStartSLOduration=2.063983908 podStartE2EDuration="2.063983908s" podCreationTimestamp="2026-03-20 17:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:08.06049098 +0000 UTC m=+1171.518522531" watchObservedRunningTime="2026-03-20 17:37:08.063983908 +0000 UTC m=+1171.522015449"
Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.054160 4795 generic.go:334] "Generic (PLEG): container finished" podID="170d948e-372e-4b54-8ecf-c370d4b10acb" containerID="b72c1a68c2af640e452031ab226e7a764d4d714ebc4a58907bac640f8e0500bb" exitCode=0 Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.054244 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-q9gtk" event={"ID":"170d948e-372e-4b54-8ecf-c370d4b10acb","Type":"ContainerDied","Data":"b72c1a68c2af640e452031ab226e7a764d4d714ebc4a58907bac640f8e0500bb"} Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.322498 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-d7ffs"] Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.324183 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.326140 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.330630 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d7ffs"] Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.390409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts\") pod \"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.390488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4dsj\" (UniqueName: \"kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj\") pod 
\"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.492821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts\") pod \"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.492900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4dsj\" (UniqueName: \"kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj\") pod \"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.495042 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts\") pod \"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.516033 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4dsj\" (UniqueName: \"kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj\") pod \"root-account-create-update-d7ffs\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") " pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:09 crc kubenswrapper[4795]: I0320 17:37:09.646166 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d7ffs" Mar 20 17:37:11 crc kubenswrapper[4795]: I0320 17:37:11.441190 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-dnp2g" Mar 20 17:37:14 crc kubenswrapper[4795]: I0320 17:37:14.895316 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.001967 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016079 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016159 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016214 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016261 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016288 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb27s\" (UniqueName: \"kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s\") pod \"170d948e-372e-4b54-8ecf-c370d4b10acb\" (UID: \"170d948e-372e-4b54-8ecf-c370d4b10acb\") " Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.002084 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run" (OuterVolumeSpecName: "var-run") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016573 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016822 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016837 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.016860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.017295 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.017815 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts" (OuterVolumeSpecName: "scripts") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.019912 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s" (OuterVolumeSpecName: "kube-api-access-kb27s") pod "170d948e-372e-4b54-8ecf-c370d4b10acb" (UID: "170d948e-372e-4b54-8ecf-c370d4b10acb"). InnerVolumeSpecName "kube-api-access-kb27s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.110039 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-q9gtk" event={"ID":"170d948e-372e-4b54-8ecf-c370d4b10acb","Type":"ContainerDied","Data":"e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c"} Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.110084 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-q9gtk" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.110099 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e75a3e425b4d637de8fadfede5c41df72f1a60faaa7e64035474eb6372518c6c" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.118294 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.118494 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/170d948e-372e-4b54-8ecf-c370d4b10acb-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.118635 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb27s\" (UniqueName: 
\"kubernetes.io/projected/170d948e-372e-4b54-8ecf-c370d4b10acb-kube-api-access-kb27s\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.118799 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/170d948e-372e-4b54-8ecf-c370d4b10acb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:15 crc kubenswrapper[4795]: I0320 17:37:15.316051 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d7ffs"] Mar 20 17:37:15 crc kubenswrapper[4795]: W0320 17:37:15.321468 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc63f125_2d90_43df_a863_b85fb2eb690e.slice/crio-261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab WatchSource:0}: Error finding container 261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab: Status 404 returned error can't find the container with id 261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.009317 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dnp2g-config-q9gtk"] Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.019043 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dnp2g-config-q9gtk"] Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.120132 4795 generic.go:334] "Generic (PLEG): container finished" podID="fc63f125-2d90-43df-a863-b85fb2eb690e" containerID="497d569160d86f7ef365c3f9c537432bd00933f71438ea39707377d46eebd046" exitCode=0 Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.120219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d7ffs" 
event={"ID":"fc63f125-2d90-43df-a863-b85fb2eb690e","Type":"ContainerDied","Data":"497d569160d86f7ef365c3f9c537432bd00933f71438ea39707377d46eebd046"} Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.120249 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d7ffs" event={"ID":"fc63f125-2d90-43df-a863-b85fb2eb690e","Type":"ContainerStarted","Data":"261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab"} Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.122024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sv5fz" event={"ID":"e951c331-872c-41b6-b747-d5129b8c0a1b","Type":"ContainerStarted","Data":"470c232d3b8dc5c0134ac3e2610bcf258029ab0b51ac60e2a7728f94a3beb865"} Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.136493 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dnp2g-config-2smjm"] Mar 20 17:37:16 crc kubenswrapper[4795]: E0320 17:37:16.137346 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="170d948e-372e-4b54-8ecf-c370d4b10acb" containerName="ovn-config" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.137379 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="170d948e-372e-4b54-8ecf-c370d4b10acb" containerName="ovn-config" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.137636 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="170d948e-372e-4b54-8ecf-c370d4b10acb" containerName="ovn-config" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.138422 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.140693 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.167079 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g-config-2smjm"] Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.201778 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-sv5fz" podStartSLOduration=2.824677256 podStartE2EDuration="15.20175712s" podCreationTimestamp="2026-03-20 17:37:01 +0000 UTC" firstStartedPulling="2026-03-20 17:37:02.523200991 +0000 UTC m=+1165.981232532" lastFinishedPulling="2026-03-20 17:37:14.900280855 +0000 UTC m=+1178.358312396" observedRunningTime="2026-03-20 17:37:16.200664167 +0000 UTC m=+1179.658695718" watchObservedRunningTime="2026-03-20 17:37:16.20175712 +0000 UTC m=+1179.659788681" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240798 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240838 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240857 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8l6\" (UniqueName: \"kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240931 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.240985 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.342583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: 
I0320 17:37:16.342628 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.342667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.342752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.342773 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.342787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8l6\" (UniqueName: \"kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 
17:37:16.342812 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.343368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.343756 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.343984 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.344165 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.345895 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.350382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/88e955e5-ba7a-4582-9d52-40333fe21b7f-etc-swift\") pod \"swift-storage-0\" (UID: \"88e955e5-ba7a-4582-9d52-40333fe21b7f\") " pod="openstack/swift-storage-0" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.382028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8l6\" (UniqueName: \"kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6\") pod \"ovn-controller-dnp2g-config-2smjm\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") " pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.458100 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-2smjm" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.524887 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 17:37:16 crc kubenswrapper[4795]: I0320 17:37:16.971618 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dnp2g-config-2smjm"] Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.134080 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-2smjm" event={"ID":"846c5a67-a071-48ed-a9e7-67c62882835c","Type":"ContainerStarted","Data":"3bcfbfed1148451d8b161514050ae2bcb87b08356c696431428b3c727102e649"} Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.152587 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:37:17 crc kubenswrapper[4795]: W0320 17:37:17.159581 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88e955e5_ba7a_4582_9d52_40333fe21b7f.slice/crio-d6c6ffeb06e45853c659218f8670f9cfcc8fae5ea919b1ca71d019ef6b506873 WatchSource:0}: Error finding container d6c6ffeb06e45853c659218f8670f9cfcc8fae5ea919b1ca71d019ef6b506873: Status 404 returned error can't find the container with id d6c6ffeb06e45853c659218f8670f9cfcc8fae5ea919b1ca71d019ef6b506873 Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.261292 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="170d948e-372e-4b54-8ecf-c370d4b10acb" path="/var/lib/kubelet/pods/170d948e-372e-4b54-8ecf-c370d4b10acb/volumes" Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.417978 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-d7ffs"
Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.466358 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4dsj\" (UniqueName: \"kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj\") pod \"fc63f125-2d90-43df-a863-b85fb2eb690e\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") "
Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.466467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts\") pod \"fc63f125-2d90-43df-a863-b85fb2eb690e\" (UID: \"fc63f125-2d90-43df-a863-b85fb2eb690e\") "
Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.467235 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc63f125-2d90-43df-a863-b85fb2eb690e" (UID: "fc63f125-2d90-43df-a863-b85fb2eb690e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.472158 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj" (OuterVolumeSpecName: "kube-api-access-l4dsj") pod "fc63f125-2d90-43df-a863-b85fb2eb690e" (UID: "fc63f125-2d90-43df-a863-b85fb2eb690e"). InnerVolumeSpecName "kube-api-access-l4dsj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.568150 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4dsj\" (UniqueName: \"kubernetes.io/projected/fc63f125-2d90-43df-a863-b85fb2eb690e-kube-api-access-l4dsj\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:17 crc kubenswrapper[4795]: I0320 17:37:17.568202 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc63f125-2d90-43df-a863-b85fb2eb690e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.150351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"d6c6ffeb06e45853c659218f8670f9cfcc8fae5ea919b1ca71d019ef6b506873"}
Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.153611 4795 generic.go:334] "Generic (PLEG): container finished" podID="846c5a67-a071-48ed-a9e7-67c62882835c" containerID="9df2c204a93d51c554ceaf159d1f9366b95bed6cc7f2757ae8ae8edae396f498" exitCode=0
Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.153783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-2smjm" event={"ID":"846c5a67-a071-48ed-a9e7-67c62882835c","Type":"ContainerDied","Data":"9df2c204a93d51c554ceaf159d1f9366b95bed6cc7f2757ae8ae8edae396f498"}
Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.157523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d7ffs" event={"ID":"fc63f125-2d90-43df-a863-b85fb2eb690e","Type":"ContainerDied","Data":"261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab"}
Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.157568 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="261ab8cef626a620c36bd5d7508f9a4d9c3b9fbf6889a35c5573b9ac70eba7ab"
Mar 20 17:37:18 crc kubenswrapper[4795]: I0320 17:37:18.157655 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d7ffs"
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.172518 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"e389a8ec0a888e7cfab4b7ccfee9700e454039da49b040fb106aef2e1974ba85"}
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.173061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"4e4d933685c4a127946eb2e1564e691c50d0f7065bc81b37becfeb97ca12f56e"}
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.173076 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"d73a955953d52c7f3cfb477122af22bc0efcd99762b19d80302a0895e38e12e1"}
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.412787 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-2smjm"
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522212 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") "
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522431 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") "
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8l6\" (UniqueName: \"kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") "
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522571 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") "
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522631 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522776 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") "
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522769 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522956 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts\") pod \"846c5a67-a071-48ed-a9e7-67c62882835c\" (UID: \"846c5a67-a071-48ed-a9e7-67c62882835c\") "
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.522829 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run" (OuterVolumeSpecName: "var-run") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.523359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.523845 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.523912 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.523934 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.523951 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/846c5a67-a071-48ed-a9e7-67c62882835c-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.524450 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts" (OuterVolumeSpecName: "scripts") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.527552 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6" (OuterVolumeSpecName: "kube-api-access-fc8l6") pod "846c5a67-a071-48ed-a9e7-67c62882835c" (UID: "846c5a67-a071-48ed-a9e7-67c62882835c"). InnerVolumeSpecName "kube-api-access-fc8l6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.625906 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/846c5a67-a071-48ed-a9e7-67c62882835c-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:19 crc kubenswrapper[4795]: I0320 17:37:19.625953 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8l6\" (UniqueName: \"kubernetes.io/projected/846c5a67-a071-48ed-a9e7-67c62882835c-kube-api-access-fc8l6\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.186676 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"95b2b207073495de5bf016766565d4264f7643e8986bca31d5a886381b8a2a45"}
Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.190768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dnp2g-config-2smjm" event={"ID":"846c5a67-a071-48ed-a9e7-67c62882835c","Type":"ContainerDied","Data":"3bcfbfed1148451d8b161514050ae2bcb87b08356c696431428b3c727102e649"}
Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.190802 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bcfbfed1148451d8b161514050ae2bcb87b08356c696431428b3c727102e649"
Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.190858 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dnp2g-config-2smjm"
Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.489354 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dnp2g-config-2smjm"]
Mar 20 17:37:20 crc kubenswrapper[4795]: I0320 17:37:20.496852 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dnp2g-config-2smjm"]
Mar 20 17:37:21 crc kubenswrapper[4795]: I0320 17:37:21.206019 4795 generic.go:334] "Generic (PLEG): container finished" podID="b8103489-e552-49b0-a32a-1069a46feff9" containerID="ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157" exitCode=0
Mar 20 17:37:21 crc kubenswrapper[4795]: I0320 17:37:21.206175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerDied","Data":"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157"}
Mar 20 17:37:21 crc kubenswrapper[4795]: I0320 17:37:21.218020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"94fe70416360f160e0eeed043b9070653346046927e4bbd473bf4701703bed88"}
Mar 20 17:37:21 crc kubenswrapper[4795]: I0320 17:37:21.218106 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"d41110d0dd61954d11b2ddc28932de3ae795d26e4d019e191f0a7561f5b0b71d"}
Mar 20 17:37:21 crc kubenswrapper[4795]: I0320 17:37:21.260531 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="846c5a67-a071-48ed-a9e7-67c62882835c" path="/var/lib/kubelet/pods/846c5a67-a071-48ed-a9e7-67c62882835c/volumes"
Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.235953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerStarted","Data":"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3"}
Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.236705 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.239185 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerID="5473602d5499b1067c63d6b98d02f2810f56405e993453774e2f6c5d19c36aea" exitCode=0
Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.239256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerDied","Data":"5473602d5499b1067c63d6b98d02f2810f56405e993453774e2f6c5d19c36aea"}
Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.244444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"f269d387c50769b37a9d2ea4c757b330becf11be54c6f550a125f2c2bb94b986"}
Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.244497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"516ad9c443197cc9e8165a78721fae46f2a7508b0cdfea8e94ef814847317bad"}
Mar 20 17:37:22 crc kubenswrapper[4795]: I0320 17:37:22.267772 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.34648388 podStartE2EDuration="1m26.267754148s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.796273767 +0000 UTC m=+1114.254305318" lastFinishedPulling="2026-03-20 17:36:47.717544005 +0000 UTC m=+1151.175575586" observedRunningTime="2026-03-20 17:37:22.262726242 +0000 UTC m=+1185.720757803" watchObservedRunningTime="2026-03-20 17:37:22.267754148 +0000 UTC m=+1185.725785689"
Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.260975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"92ead5f56ac7347c810dd90b1560b35e81836d7f359a796b46a30a7c6c019707"}
Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.261021 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"d4618223e960efe7ce48a402fa7f9f6d73b91babeb1e98beebab62ac5781f4dc"}
Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.263803 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerStarted","Data":"930cc5d12a5b8ceb897b37f689f02ab87b93b53244832868d8761d1d4336b1e3"}
Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.263971 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.266486 4795 generic.go:334] "Generic (PLEG): container finished" podID="e951c331-872c-41b6-b747-d5129b8c0a1b" containerID="470c232d3b8dc5c0134ac3e2610bcf258029ab0b51ac60e2a7728f94a3beb865" exitCode=0
Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.266586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sv5fz" event={"ID":"e951c331-872c-41b6-b747-d5129b8c0a1b","Type":"ContainerDied","Data":"470c232d3b8dc5c0134ac3e2610bcf258029ab0b51ac60e2a7728f94a3beb865"}
Mar 20 17:37:23 crc kubenswrapper[4795]: I0320 17:37:23.288852 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371949.56594 podStartE2EDuration="1m27.288835305s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="2026-03-20 17:36:10.998998659 +0000 UTC m=+1114.457030200" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:23.285323915 +0000 UTC m=+1186.743355466" watchObservedRunningTime="2026-03-20 17:37:23.288835305 +0000 UTC m=+1186.746866856"
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.281033 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"cf71f961fcc91b9e35fa81659d3f697bfb918bf0cbf6d4d3d897add69350217e"}
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.281366 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"7dd41f0c40aeeab4fb0e2dda77fcf2039ae08488c188c3ac01c40ecd248916e7"}
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.656956 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.708541 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data\") pod \"e951c331-872c-41b6-b747-d5129b8c0a1b\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") "
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.708663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data\") pod \"e951c331-872c-41b6-b747-d5129b8c0a1b\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") "
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.708762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle\") pod \"e951c331-872c-41b6-b747-d5129b8c0a1b\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") "
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.708806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz227\" (UniqueName: \"kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227\") pod \"e951c331-872c-41b6-b747-d5129b8c0a1b\" (UID: \"e951c331-872c-41b6-b747-d5129b8c0a1b\") "
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.724001 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e951c331-872c-41b6-b747-d5129b8c0a1b" (UID: "e951c331-872c-41b6-b747-d5129b8c0a1b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.728988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227" (OuterVolumeSpecName: "kube-api-access-pz227") pod "e951c331-872c-41b6-b747-d5129b8c0a1b" (UID: "e951c331-872c-41b6-b747-d5129b8c0a1b"). InnerVolumeSpecName "kube-api-access-pz227". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.752634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e951c331-872c-41b6-b747-d5129b8c0a1b" (UID: "e951c331-872c-41b6-b747-d5129b8c0a1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.754218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data" (OuterVolumeSpecName: "config-data") pod "e951c331-872c-41b6-b747-d5129b8c0a1b" (UID: "e951c331-872c-41b6-b747-d5129b8c0a1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.810885 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.810920 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.810932 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e951c331-872c-41b6-b747-d5129b8c0a1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:24 crc kubenswrapper[4795]: I0320 17:37:24.810944 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz227\" (UniqueName: \"kubernetes.io/projected/e951c331-872c-41b6-b747-d5129b8c0a1b-kube-api-access-pz227\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.288199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sv5fz" event={"ID":"e951c331-872c-41b6-b747-d5129b8c0a1b","Type":"ContainerDied","Data":"846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e"}
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.288536 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="846e6d7fa128c4574d3ee937a320614ad8d6b0aea697784236cef95caf3bdb1e"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.288603 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sv5fz"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.801549 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"]
Mar 20 17:37:25 crc kubenswrapper[4795]: E0320 17:37:25.801964 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e951c331-872c-41b6-b747-d5129b8c0a1b" containerName="glance-db-sync"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802004 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e951c331-872c-41b6-b747-d5129b8c0a1b" containerName="glance-db-sync"
Mar 20 17:37:25 crc kubenswrapper[4795]: E0320 17:37:25.802028 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc63f125-2d90-43df-a863-b85fb2eb690e" containerName="mariadb-account-create-update"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802037 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc63f125-2d90-43df-a863-b85fb2eb690e" containerName="mariadb-account-create-update"
Mar 20 17:37:25 crc kubenswrapper[4795]: E0320 17:37:25.802054 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846c5a67-a071-48ed-a9e7-67c62882835c" containerName="ovn-config"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802062 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="846c5a67-a071-48ed-a9e7-67c62882835c" containerName="ovn-config"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802239 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc63f125-2d90-43df-a863-b85fb2eb690e" containerName="mariadb-account-create-update"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802263 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="846c5a67-a071-48ed-a9e7-67c62882835c" containerName="ovn-config"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.802282 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e951c331-872c-41b6-b747-d5129b8c0a1b" containerName="glance-db-sync"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.803254 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.819993 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"]
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.829526 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2svd\" (UniqueName: \"kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.829616 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.829750 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.829885 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.829968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.931942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.932003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.932086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2svd\" (UniqueName: \"kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.932123 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.932155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.933229 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.933275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.933282 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.933244 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:25 crc kubenswrapper[4795]: I0320 17:37:25.950546 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2svd\" (UniqueName: \"kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd\") pod \"dnsmasq-dns-5b946c75cc-79n92\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:26 crc kubenswrapper[4795]: I0320 17:37:26.117549 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-79n92"
Mar 20 17:37:26 crc kubenswrapper[4795]: I0320 17:37:26.744333 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"]
Mar 20 17:37:27 crc kubenswrapper[4795]: I0320 17:37:27.303837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" event={"ID":"c1a56dca-ff60-46df-8582-70547b180198","Type":"ContainerStarted","Data":"aa3f1672f9f6d3489df066685a31982534a33df447f9e485fd30c0b9c6ecc887"}
Mar 20 17:37:27 crc kubenswrapper[4795]: I0320 17:37:27.310260 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"bf3ce42cae265bb99fb791f8eed17095fce85d87cfb2f7c86fb8f9b0bfe58bf9"}
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.319738 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1a56dca-ff60-46df-8582-70547b180198" containerID="2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7" exitCode=0
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.319822 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" event={"ID":"c1a56dca-ff60-46df-8582-70547b180198","Type":"ContainerDied","Data":"2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7"}
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.328663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"f9b19475d31f465754f36521ef65528e71a402fa42bc145e49d11499c802e865"}
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.328724 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"88e955e5-ba7a-4582-9d52-40333fe21b7f","Type":"ContainerStarted","Data":"2aac8e6c32a191322b5f706bcb756b20123a5381decfbbcdfa8f3b0ec3eb505b"}
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.423405 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=39.762671026 podStartE2EDuration="45.423390706s" podCreationTimestamp="2026-03-20 17:36:43 +0000 UTC" firstStartedPulling="2026-03-20 17:37:17.162393946 +0000 UTC m=+1180.620425487" lastFinishedPulling="2026-03-20 17:37:22.823113586 +0000 UTC m=+1186.281145167" observedRunningTime="2026-03-20 17:37:28.421894619 +0000 UTC m=+1191.879926170" watchObservedRunningTime="2026-03-20 17:37:28.423390706 +0000 UTC m=+1191.881422247"
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.682505 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"]
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.709659 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"]
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.711188 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp"
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.722053 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.733372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"]
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.779657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp"
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.779815 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxcnf\" (UniqueName: \"kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp"
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.779849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp"
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.779884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp"
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.779956 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp"
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.780038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp"
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.881821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp"
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.881877 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxcnf\" (UniqueName: \"kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp"
Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.881899 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") "
pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.881919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.881954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.882008 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.882832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.883304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc 
kubenswrapper[4795]: I0320 17:37:28.884059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.884533 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.885058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:28 crc kubenswrapper[4795]: I0320 17:37:28.910329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxcnf\" (UniqueName: \"kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf\") pod \"dnsmasq-dns-74f6bcbc87-g56fp\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:29 crc kubenswrapper[4795]: I0320 17:37:29.027340 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:29 crc kubenswrapper[4795]: I0320 17:37:29.337848 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" event={"ID":"c1a56dca-ff60-46df-8582-70547b180198","Type":"ContainerStarted","Data":"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683"} Mar 20 17:37:29 crc kubenswrapper[4795]: I0320 17:37:29.338120 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:29 crc kubenswrapper[4795]: I0320 17:37:29.355161 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" podStartSLOduration=4.355140811 podStartE2EDuration="4.355140811s" podCreationTimestamp="2026-03-20 17:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:29.351979173 +0000 UTC m=+1192.810010714" watchObservedRunningTime="2026-03-20 17:37:29.355140811 +0000 UTC m=+1192.813172352" Mar 20 17:37:29 crc kubenswrapper[4795]: I0320 17:37:29.574362 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"] Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.348877 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed82521a-8a78-4611-870f-5ad53625bddf" containerID="99144a28a69d3c6fb3d096bda447f3ff1d42028233b58205725db4c507c27464" exitCode=0 Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.348937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" event={"ID":"ed82521a-8a78-4611-870f-5ad53625bddf","Type":"ContainerDied","Data":"99144a28a69d3c6fb3d096bda447f3ff1d42028233b58205725db4c507c27464"} Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.349403 4795 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="dnsmasq-dns" containerID="cri-o://3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683" gracePeriod=10 Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.349458 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" event={"ID":"ed82521a-8a78-4611-870f-5ad53625bddf","Type":"ContainerStarted","Data":"4bbbefbb238ecd4d186023f5577494b794cb2bb3b7f5edef795d647b067a660b"} Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.743029 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.911211 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb\") pod \"c1a56dca-ff60-46df-8582-70547b180198\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.912015 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb\") pod \"c1a56dca-ff60-46df-8582-70547b180198\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.912132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc\") pod \"c1a56dca-ff60-46df-8582-70547b180198\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.912184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2svd\" (UniqueName: 
\"kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd\") pod \"c1a56dca-ff60-46df-8582-70547b180198\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.912225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config\") pod \"c1a56dca-ff60-46df-8582-70547b180198\" (UID: \"c1a56dca-ff60-46df-8582-70547b180198\") " Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.919759 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd" (OuterVolumeSpecName: "kube-api-access-m2svd") pod "c1a56dca-ff60-46df-8582-70547b180198" (UID: "c1a56dca-ff60-46df-8582-70547b180198"). InnerVolumeSpecName "kube-api-access-m2svd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.970464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c1a56dca-ff60-46df-8582-70547b180198" (UID: "c1a56dca-ff60-46df-8582-70547b180198"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.972362 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c1a56dca-ff60-46df-8582-70547b180198" (UID: "c1a56dca-ff60-46df-8582-70547b180198"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.976658 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config" (OuterVolumeSpecName: "config") pod "c1a56dca-ff60-46df-8582-70547b180198" (UID: "c1a56dca-ff60-46df-8582-70547b180198"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:30 crc kubenswrapper[4795]: I0320 17:37:30.977738 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c1a56dca-ff60-46df-8582-70547b180198" (UID: "c1a56dca-ff60-46df-8582-70547b180198"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.015281 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.015334 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.015346 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.015359 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1a56dca-ff60-46df-8582-70547b180198-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:31 crc kubenswrapper[4795]: 
I0320 17:37:31.015370 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2svd\" (UniqueName: \"kubernetes.io/projected/c1a56dca-ff60-46df-8582-70547b180198-kube-api-access-m2svd\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.360010 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" event={"ID":"ed82521a-8a78-4611-870f-5ad53625bddf","Type":"ContainerStarted","Data":"46bc17154a9a8ac331153a8f5a685f3f770ff87ff542f8ade4f4cba000f1d481"} Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.360731 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.363097 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1a56dca-ff60-46df-8582-70547b180198" containerID="3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683" exitCode=0 Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.363136 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" event={"ID":"c1a56dca-ff60-46df-8582-70547b180198","Type":"ContainerDied","Data":"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683"} Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.363159 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" event={"ID":"c1a56dca-ff60-46df-8582-70547b180198","Type":"ContainerDied","Data":"aa3f1672f9f6d3489df066685a31982534a33df447f9e485fd30c0b9c6ecc887"} Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.363173 4795 scope.go:117] "RemoveContainer" containerID="3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.363212 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-79n92" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.381371 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" podStartSLOduration=3.381350408 podStartE2EDuration="3.381350408s" podCreationTimestamp="2026-03-20 17:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:31.379659906 +0000 UTC m=+1194.837691537" watchObservedRunningTime="2026-03-20 17:37:31.381350408 +0000 UTC m=+1194.839381979" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.457830 4795 scope.go:117] "RemoveContainer" containerID="2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.463759 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"] Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.467311 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-79n92"] Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.494584 4795 scope.go:117] "RemoveContainer" containerID="3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683" Mar 20 17:37:31 crc kubenswrapper[4795]: E0320 17:37:31.494985 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683\": container with ID starting with 3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683 not found: ID does not exist" containerID="3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.495026 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683"} err="failed to get container status \"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683\": rpc error: code = NotFound desc = could not find container \"3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683\": container with ID starting with 3fc5c5c166ab50da391548cd6fa7eb73d1acc720ef80947f25cf207174717683 not found: ID does not exist" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.495053 4795 scope.go:117] "RemoveContainer" containerID="2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7" Mar 20 17:37:31 crc kubenswrapper[4795]: E0320 17:37:31.495458 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7\": container with ID starting with 2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7 not found: ID does not exist" containerID="2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7" Mar 20 17:37:31 crc kubenswrapper[4795]: I0320 17:37:31.495480 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7"} err="failed to get container status \"2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7\": rpc error: code = NotFound desc = could not find container \"2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7\": container with ID starting with 2116311d1770d9550bf63fdbf10b200f65c26cb63836bf4506fc34d95cd4b5a7 not found: ID does not exist" Mar 20 17:37:33 crc kubenswrapper[4795]: I0320 17:37:33.264017 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a56dca-ff60-46df-8582-70547b180198" path="/var/lib/kubelet/pods/c1a56dca-ff60-46df-8582-70547b180198/volumes" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 
17:37:37.502041 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.834904 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.835823 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-b5nks"] Mar 20 17:37:37 crc kubenswrapper[4795]: E0320 17:37:37.836140 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="init" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.836156 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="init" Mar 20 17:37:37 crc kubenswrapper[4795]: E0320 17:37:37.836176 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="dnsmasq-dns" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.836183 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="dnsmasq-dns" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.836338 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a56dca-ff60-46df-8582-70547b180198" containerName="dnsmasq-dns" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.836791 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.852437 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-b5nks"] Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.937993 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vqn\" (UniqueName: \"kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.938074 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.987515 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-fe40-account-create-update-jh9t8"] Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.988469 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:37 crc kubenswrapper[4795]: I0320 17:37:37.991470 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.004383 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fe40-account-create-update-jh9t8"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.040776 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7vqn\" (UniqueName: \"kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.040824 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.041489 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.112070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vqn\" (UniqueName: \"kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn\") pod \"cinder-db-create-b5nks\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") " pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 
17:37:38.143526 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts\") pod \"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.143598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6cxv\" (UniqueName: \"kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv\") pod \"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.144711 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-2xqcf"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.146064 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.161778 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-b5nks" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.168588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2xqcf"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.223200 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-d5tx6"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.224325 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.247485 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgnfn\" (UniqueName: \"kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn\") pod \"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.247801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts\") pod \"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.247919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6cxv\" (UniqueName: \"kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv\") pod \"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.248006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts\") pod \"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.248791 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts\") pod 
\"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.249118 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c60c-account-create-update-hhzrt"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.250090 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.257305 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d5tx6"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.257960 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.268833 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c60c-account-create-update-hhzrt"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.283997 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dwhh5"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.284918 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.286522 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.287045 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29kfm" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.287463 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.287668 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.294635 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6cxv\" (UniqueName: \"kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv\") pod \"cinder-fe40-account-create-update-jh9t8\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") " pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.294725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dwhh5"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.302279 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fe40-account-create-update-jh9t8" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.349637 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.349670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.349722 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hdwq\" (UniqueName: \"kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.350131 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts\") pod \"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.350258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgnfn\" (UniqueName: \"kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn\") pod 
\"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.350302 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5j4r\" (UniqueName: \"kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.360549 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts\") pod \"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.368396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgnfn\" (UniqueName: \"kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn\") pod \"barbican-db-create-2xqcf\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") " pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.438286 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0ac3-account-create-update-km4zq"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.440030 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.443056 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5j4r\" (UniqueName: \"kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451391 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451485 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6hdwq\" (UniqueName: \"kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.451539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lmwj\" (UniqueName: \"kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.452394 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.453179 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.457706 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-0ac3-account-create-update-km4zq"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.466841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hdwq\" (UniqueName: \"kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq\") pod \"neutron-db-create-d5tx6\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") " pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.471004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5j4r\" (UniqueName: \"kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r\") pod \"neutron-c60c-account-create-update-hhzrt\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") " pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.479038 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2xqcf" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.505540 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-b5nks"] Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.554521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l97q4\" (UniqueName: \"kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.554962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " 
pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.555025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.555050 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.555094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lmwj\" (UniqueName: \"kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.559953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.561836 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc 
kubenswrapper[4795]: I0320 17:37:38.575448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lmwj\" (UniqueName: \"kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj\") pod \"keystone-db-sync-dwhh5\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") " pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.656150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l97q4\" (UniqueName: \"kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.656227 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.656981 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.668807 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-d5tx6" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.671900 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l97q4\" (UniqueName: \"kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4\") pod \"barbican-0ac3-account-create-update-km4zq\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") " pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.680808 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c60c-account-create-update-hhzrt" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.688669 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dwhh5" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.763106 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0ac3-account-create-update-km4zq" Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.788939 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fe40-account-create-update-jh9t8"] Mar 20 17:37:38 crc kubenswrapper[4795]: W0320 17:37:38.808409 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ff9ec79_6bd9_470e_8a75_8df1f3c52851.slice/crio-5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b WatchSource:0}: Error finding container 5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b: Status 404 returned error can't find the container with id 5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b Mar 20 17:37:38 crc kubenswrapper[4795]: I0320 17:37:38.911523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2xqcf"] Mar 20 17:37:38 crc kubenswrapper[4795]: W0320 17:37:38.935392 
4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode065e2d4_096b_426b_a1f8_14311adb7cbc.slice/crio-b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984 WatchSource:0}: Error finding container b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984: Status 404 returned error can't find the container with id b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.027809 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.112075 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"] Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.112322 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-8grln" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="dnsmasq-dns" containerID="cri-o://7043047e88a0378017830c2f9e0915780f4eef0732a290fd3459875c42d1f7cc" gracePeriod=10 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.149824 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c60c-account-create-update-hhzrt"] Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.212703 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dwhh5"] Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.221967 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d5tx6"] Mar 20 17:37:39 crc kubenswrapper[4795]: W0320 17:37:39.272018 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b15c724_622b_4da7_96a3_01949d04ecac.slice/crio-aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565 
WatchSource:0}: Error finding container aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565: Status 404 returned error can't find the container with id aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.401786 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0ac3-account-create-update-km4zq"] Mar 20 17:37:39 crc kubenswrapper[4795]: W0320 17:37:39.451924 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c0847a4_54b5_4068_bfa8_730a19e96d9c.slice/crio-c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9 WatchSource:0}: Error finding container c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9: Status 404 returned error can't find the container with id c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.454728 4795 generic.go:334] "Generic (PLEG): container finished" podID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerID="7043047e88a0378017830c2f9e0915780f4eef0732a290fd3459875c42d1f7cc" exitCode=0 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.454776 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8grln" event={"ID":"601af69d-c03f-4bdf-b3bf-67ba791674f9","Type":"ContainerDied","Data":"7043047e88a0378017830c2f9e0915780f4eef0732a290fd3459875c42d1f7cc"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.459020 4795 generic.go:334] "Generic (PLEG): container finished" podID="e065e2d4-096b-426b-a1f8-14311adb7cbc" containerID="0bd7daab0116804ff4450e365b81c7b208a20da2cd4b665ce83729724da32638" exitCode=0 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.459087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2xqcf" 
event={"ID":"e065e2d4-096b-426b-a1f8-14311adb7cbc","Type":"ContainerDied","Data":"0bd7daab0116804ff4450e365b81c7b208a20da2cd4b665ce83729724da32638"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.459118 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2xqcf" event={"ID":"e065e2d4-096b-426b-a1f8-14311adb7cbc","Type":"ContainerStarted","Data":"b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.461244 4795 generic.go:334] "Generic (PLEG): container finished" podID="373ddf98-d9da-4f1f-a6be-3d16e3cbad57" containerID="d85d02558764daff4d2300daa1f7a51dd79d0b89452cbb1821643bd1f3d0ff3c" exitCode=0 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.461296 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-b5nks" event={"ID":"373ddf98-d9da-4f1f-a6be-3d16e3cbad57","Type":"ContainerDied","Data":"d85d02558764daff4d2300daa1f7a51dd79d0b89452cbb1821643bd1f3d0ff3c"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.461314 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-b5nks" event={"ID":"373ddf98-d9da-4f1f-a6be-3d16e3cbad57","Type":"ContainerStarted","Data":"01df350322d624589cd020be71f4012590e18461b49a99764038fdca3981f6bb"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.463145 4795 generic.go:334] "Generic (PLEG): container finished" podID="9ff9ec79-6bd9-470e-8a75-8df1f3c52851" containerID="0f5ef18005b655abcc8e4883b9bee8538648f3cf86fe68a6e17cb1ecb194c52e" exitCode=0 Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.463210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fe40-account-create-update-jh9t8" event={"ID":"9ff9ec79-6bd9-470e-8a75-8df1f3c52851","Type":"ContainerDied","Data":"0f5ef18005b655abcc8e4883b9bee8538648f3cf86fe68a6e17cb1ecb194c52e"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.463229 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-fe40-account-create-update-jh9t8" event={"ID":"9ff9ec79-6bd9-470e-8a75-8df1f3c52851","Type":"ContainerStarted","Data":"5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.464415 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c60c-account-create-update-hhzrt" event={"ID":"36d71698-4dc2-448a-9330-23372e2d508b","Type":"ContainerStarted","Data":"27f218d293aef7485b27dc5eda4c8db8ebcf99d106c06e80ce6dd66bae5fccab"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.468120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dwhh5" event={"ID":"7b15c724-622b-4da7-96a3-01949d04ecac","Type":"ContainerStarted","Data":"aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.469098 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d5tx6" event={"ID":"18b1c5f0-e7fb-44b7-8c75-c8036f371c56","Type":"ContainerStarted","Data":"a2be0f1e436101489f20c15d229127ac688cb2bc6666ffbfaa6ce5f6656ba736"} Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.557515 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8grln" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.672021 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc\") pod \"601af69d-c03f-4bdf-b3bf-67ba791674f9\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.672093 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb\") pod \"601af69d-c03f-4bdf-b3bf-67ba791674f9\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.672149 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmkt4\" (UniqueName: \"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4\") pod \"601af69d-c03f-4bdf-b3bf-67ba791674f9\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.672169 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb\") pod \"601af69d-c03f-4bdf-b3bf-67ba791674f9\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.672209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config\") pod \"601af69d-c03f-4bdf-b3bf-67ba791674f9\" (UID: \"601af69d-c03f-4bdf-b3bf-67ba791674f9\") " Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.677411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4" (OuterVolumeSpecName: "kube-api-access-xmkt4") pod "601af69d-c03f-4bdf-b3bf-67ba791674f9" (UID: "601af69d-c03f-4bdf-b3bf-67ba791674f9"). InnerVolumeSpecName "kube-api-access-xmkt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.719904 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config" (OuterVolumeSpecName: "config") pod "601af69d-c03f-4bdf-b3bf-67ba791674f9" (UID: "601af69d-c03f-4bdf-b3bf-67ba791674f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.722524 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "601af69d-c03f-4bdf-b3bf-67ba791674f9" (UID: "601af69d-c03f-4bdf-b3bf-67ba791674f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.724915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "601af69d-c03f-4bdf-b3bf-67ba791674f9" (UID: "601af69d-c03f-4bdf-b3bf-67ba791674f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.729339 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "601af69d-c03f-4bdf-b3bf-67ba791674f9" (UID: "601af69d-c03f-4bdf-b3bf-67ba791674f9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.774155 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.774416 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.774432 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmkt4\" (UniqueName: \"kubernetes.io/projected/601af69d-c03f-4bdf-b3bf-67ba791674f9-kube-api-access-xmkt4\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.774443 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:39 crc kubenswrapper[4795]: I0320 17:37:39.774456 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/601af69d-c03f-4bdf-b3bf-67ba791674f9-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.481084 4795 generic.go:334] "Generic (PLEG): container finished" podID="1c0847a4-54b5-4068-bfa8-730a19e96d9c" containerID="e0dfbaccbeeb5b8f99fb5498e364810e7c89661123cf8b487af69c7d6020134e" exitCode=0 Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.481186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0ac3-account-create-update-km4zq" event={"ID":"1c0847a4-54b5-4068-bfa8-730a19e96d9c","Type":"ContainerDied","Data":"e0dfbaccbeeb5b8f99fb5498e364810e7c89661123cf8b487af69c7d6020134e"} Mar 20 17:37:40 crc 
kubenswrapper[4795]: I0320 17:37:40.481237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0ac3-account-create-update-km4zq" event={"ID":"1c0847a4-54b5-4068-bfa8-730a19e96d9c","Type":"ContainerStarted","Data":"c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9"} Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.483996 4795 generic.go:334] "Generic (PLEG): container finished" podID="36d71698-4dc2-448a-9330-23372e2d508b" containerID="3e3072d7a6a60ff440da8ec24082885e62958e6ce5ded9fd9910a3d0c2817a07" exitCode=0 Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.484063 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c60c-account-create-update-hhzrt" event={"ID":"36d71698-4dc2-448a-9330-23372e2d508b","Type":"ContainerDied","Data":"3e3072d7a6a60ff440da8ec24082885e62958e6ce5ded9fd9910a3d0c2817a07"} Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.486436 4795 generic.go:334] "Generic (PLEG): container finished" podID="18b1c5f0-e7fb-44b7-8c75-c8036f371c56" containerID="5d693b98a616da996bc733e3508b576b31f68d8eb1c9fc7b9800283fac04b343" exitCode=0 Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.486501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d5tx6" event={"ID":"18b1c5f0-e7fb-44b7-8c75-c8036f371c56","Type":"ContainerDied","Data":"5d693b98a616da996bc733e3508b576b31f68d8eb1c9fc7b9800283fac04b343"} Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.490508 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-8grln"
Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.493721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-8grln" event={"ID":"601af69d-c03f-4bdf-b3bf-67ba791674f9","Type":"ContainerDied","Data":"f49aeeb3f2eaf36f87117fd9bb4fc971651fbd9ce295830a95f52f7faa0753d6"}
Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.493820 4795 scope.go:117] "RemoveContainer" containerID="7043047e88a0378017830c2f9e0915780f4eef0732a290fd3459875c42d1f7cc"
Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.531934 4795 scope.go:117] "RemoveContainer" containerID="a43865e4251904d08d5c0655d2fd65e83c2843454a9cfbf7734d7aa91dad11f3"
Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.568130 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"]
Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.574672 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-8grln"]
Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.883327 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fe40-account-create-update-jh9t8"
Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.969185 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2xqcf"
Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.973758 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-b5nks"
Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.994647 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts\") pod \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") "
Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.994737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6cxv\" (UniqueName: \"kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv\") pod \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\" (UID: \"9ff9ec79-6bd9-470e-8a75-8df1f3c52851\") "
Mar 20 17:37:40 crc kubenswrapper[4795]: I0320 17:37:40.996162 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ff9ec79-6bd9-470e-8a75-8df1f3c52851" (UID: "9ff9ec79-6bd9-470e-8a75-8df1f3c52851"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.004062 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv" (OuterVolumeSpecName: "kube-api-access-p6cxv") pod "9ff9ec79-6bd9-470e-8a75-8df1f3c52851" (UID: "9ff9ec79-6bd9-470e-8a75-8df1f3c52851"). InnerVolumeSpecName "kube-api-access-p6cxv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.096501 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts\") pod \"e065e2d4-096b-426b-a1f8-14311adb7cbc\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") "
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.096807 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgnfn\" (UniqueName: \"kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn\") pod \"e065e2d4-096b-426b-a1f8-14311adb7cbc\" (UID: \"e065e2d4-096b-426b-a1f8-14311adb7cbc\") "
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.096861 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7vqn\" (UniqueName: \"kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn\") pod \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") "
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.096882 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts\") pod \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\" (UID: \"373ddf98-d9da-4f1f-a6be-3d16e3cbad57\") "
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.097301 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.097320 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6cxv\" (UniqueName: \"kubernetes.io/projected/9ff9ec79-6bd9-470e-8a75-8df1f3c52851-kube-api-access-p6cxv\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.097473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e065e2d4-096b-426b-a1f8-14311adb7cbc" (UID: "e065e2d4-096b-426b-a1f8-14311adb7cbc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.097635 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "373ddf98-d9da-4f1f-a6be-3d16e3cbad57" (UID: "373ddf98-d9da-4f1f-a6be-3d16e3cbad57"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.101506 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn" (OuterVolumeSpecName: "kube-api-access-tgnfn") pod "e065e2d4-096b-426b-a1f8-14311adb7cbc" (UID: "e065e2d4-096b-426b-a1f8-14311adb7cbc"). InnerVolumeSpecName "kube-api-access-tgnfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.102412 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn" (OuterVolumeSpecName: "kube-api-access-c7vqn") pod "373ddf98-d9da-4f1f-a6be-3d16e3cbad57" (UID: "373ddf98-d9da-4f1f-a6be-3d16e3cbad57"). InnerVolumeSpecName "kube-api-access-c7vqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.198965 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgnfn\" (UniqueName: \"kubernetes.io/projected/e065e2d4-096b-426b-a1f8-14311adb7cbc-kube-api-access-tgnfn\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.199040 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7vqn\" (UniqueName: \"kubernetes.io/projected/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-kube-api-access-c7vqn\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.199050 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ddf98-d9da-4f1f-a6be-3d16e3cbad57-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.199059 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e065e2d4-096b-426b-a1f8-14311adb7cbc-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.262817 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" path="/var/lib/kubelet/pods/601af69d-c03f-4bdf-b3bf-67ba791674f9/volumes"
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.299643 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.299705 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.499957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2xqcf" event={"ID":"e065e2d4-096b-426b-a1f8-14311adb7cbc","Type":"ContainerDied","Data":"b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984"}
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.499999 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27f152f7a114e2e7e93e92923c0b393d8e432ae7e69a213718b59a66a710984"
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.500031 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2xqcf"
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.502345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-b5nks" event={"ID":"373ddf98-d9da-4f1f-a6be-3d16e3cbad57","Type":"ContainerDied","Data":"01df350322d624589cd020be71f4012590e18461b49a99764038fdca3981f6bb"}
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.502389 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01df350322d624589cd020be71f4012590e18461b49a99764038fdca3981f6bb"
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.502393 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-b5nks"
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.504292 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fe40-account-create-update-jh9t8"
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.504286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fe40-account-create-update-jh9t8" event={"ID":"9ff9ec79-6bd9-470e-8a75-8df1f3c52851","Type":"ContainerDied","Data":"5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b"}
Mar 20 17:37:41 crc kubenswrapper[4795]: I0320 17:37:41.504479 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bee5140fa059f686560e98c3b2dbc013af9d6b0529b08e35ba971a75819852b"
Mar 20 17:37:43 crc kubenswrapper[4795]: I0320 17:37:43.937087 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0ac3-account-create-update-km4zq"
Mar 20 17:37:43 crc kubenswrapper[4795]: I0320 17:37:43.942807 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c60c-account-create-update-hhzrt"
Mar 20 17:37:43 crc kubenswrapper[4795]: I0320 17:37:43.948080 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d5tx6"
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.066912 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts\") pod \"36d71698-4dc2-448a-9330-23372e2d508b\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") "
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.066988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l97q4\" (UniqueName: \"kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4\") pod \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") "
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067057 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts\") pod \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\" (UID: \"1c0847a4-54b5-4068-bfa8-730a19e96d9c\") "
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067080 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts\") pod \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") "
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067122 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hdwq\" (UniqueName: \"kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq\") pod \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\" (UID: \"18b1c5f0-e7fb-44b7-8c75-c8036f371c56\") "
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067161 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5j4r\" (UniqueName: \"kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r\") pod \"36d71698-4dc2-448a-9330-23372e2d508b\" (UID: \"36d71698-4dc2-448a-9330-23372e2d508b\") "
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067400 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36d71698-4dc2-448a-9330-23372e2d508b" (UID: "36d71698-4dc2-448a-9330-23372e2d508b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067881 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d71698-4dc2-448a-9330-23372e2d508b-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18b1c5f0-e7fb-44b7-8c75-c8036f371c56" (UID: "18b1c5f0-e7fb-44b7-8c75-c8036f371c56"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.067979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c0847a4-54b5-4068-bfa8-730a19e96d9c" (UID: "1c0847a4-54b5-4068-bfa8-730a19e96d9c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.071134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq" (OuterVolumeSpecName: "kube-api-access-6hdwq") pod "18b1c5f0-e7fb-44b7-8c75-c8036f371c56" (UID: "18b1c5f0-e7fb-44b7-8c75-c8036f371c56"). InnerVolumeSpecName "kube-api-access-6hdwq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.071165 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4" (OuterVolumeSpecName: "kube-api-access-l97q4") pod "1c0847a4-54b5-4068-bfa8-730a19e96d9c" (UID: "1c0847a4-54b5-4068-bfa8-730a19e96d9c"). InnerVolumeSpecName "kube-api-access-l97q4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.072914 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r" (OuterVolumeSpecName: "kube-api-access-t5j4r") pod "36d71698-4dc2-448a-9330-23372e2d508b" (UID: "36d71698-4dc2-448a-9330-23372e2d508b"). InnerVolumeSpecName "kube-api-access-t5j4r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.169329 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l97q4\" (UniqueName: \"kubernetes.io/projected/1c0847a4-54b5-4068-bfa8-730a19e96d9c-kube-api-access-l97q4\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.169729 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c0847a4-54b5-4068-bfa8-730a19e96d9c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.169743 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.169754 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hdwq\" (UniqueName: \"kubernetes.io/projected/18b1c5f0-e7fb-44b7-8c75-c8036f371c56-kube-api-access-6hdwq\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.169764 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5j4r\" (UniqueName: \"kubernetes.io/projected/36d71698-4dc2-448a-9330-23372e2d508b-kube-api-access-t5j4r\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.544583 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dwhh5" event={"ID":"7b15c724-622b-4da7-96a3-01949d04ecac","Type":"ContainerStarted","Data":"5b477a108858cabbe8510a1a17d7f7ac3c69ce053e8fe87204336bad4594bfc0"}
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.548947 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d5tx6" event={"ID":"18b1c5f0-e7fb-44b7-8c75-c8036f371c56","Type":"ContainerDied","Data":"a2be0f1e436101489f20c15d229127ac688cb2bc6666ffbfaa6ce5f6656ba736"}
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.549070 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2be0f1e436101489f20c15d229127ac688cb2bc6666ffbfaa6ce5f6656ba736"
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.549265 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d5tx6"
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.559309 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0ac3-account-create-update-km4zq" event={"ID":"1c0847a4-54b5-4068-bfa8-730a19e96d9c","Type":"ContainerDied","Data":"c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9"}
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.559354 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8a78ee8d53aabca80a59d0870d7daa98b478b2dec2aebdb7d671b62a593cec9"
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.559429 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0ac3-account-create-update-km4zq"
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.575487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c60c-account-create-update-hhzrt" event={"ID":"36d71698-4dc2-448a-9330-23372e2d508b","Type":"ContainerDied","Data":"27f218d293aef7485b27dc5eda4c8db8ebcf99d106c06e80ce6dd66bae5fccab"}
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.575553 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f218d293aef7485b27dc5eda4c8db8ebcf99d106c06e80ce6dd66bae5fccab"
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.575595 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c60c-account-create-update-hhzrt"
Mar 20 17:37:44 crc kubenswrapper[4795]: I0320 17:37:44.580102 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dwhh5" podStartSLOduration=1.915128879 podStartE2EDuration="6.580073701s" podCreationTimestamp="2026-03-20 17:37:38 +0000 UTC" firstStartedPulling="2026-03-20 17:37:39.274786895 +0000 UTC m=+1202.732818436" lastFinishedPulling="2026-03-20 17:37:43.939731717 +0000 UTC m=+1207.397763258" observedRunningTime="2026-03-20 17:37:44.572085392 +0000 UTC m=+1208.030117003" watchObservedRunningTime="2026-03-20 17:37:44.580073701 +0000 UTC m=+1208.038105272"
Mar 20 17:37:44 crc kubenswrapper[4795]: E0320 17:37:44.790095 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c0847a4_54b5_4068_bfa8_730a19e96d9c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b1c5f0_e7fb_44b7_8c75_c8036f371c56.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36d71698_4dc2_448a_9330_23372e2d508b.slice/crio-27f218d293aef7485b27dc5eda4c8db8ebcf99d106c06e80ce6dd66bae5fccab\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36d71698_4dc2_448a_9330_23372e2d508b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b1c5f0_e7fb_44b7_8c75_c8036f371c56.slice/crio-a2be0f1e436101489f20c15d229127ac688cb2bc6666ffbfaa6ce5f6656ba736\": RecentStats: unable to find data in memory cache]"
Mar 20 17:37:47 crc kubenswrapper[4795]: I0320 17:37:47.612022 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b15c724-622b-4da7-96a3-01949d04ecac" containerID="5b477a108858cabbe8510a1a17d7f7ac3c69ce053e8fe87204336bad4594bfc0" exitCode=0
Mar 20 17:37:47 crc kubenswrapper[4795]: I0320 17:37:47.612135 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dwhh5" event={"ID":"7b15c724-622b-4da7-96a3-01949d04ecac","Type":"ContainerDied","Data":"5b477a108858cabbe8510a1a17d7f7ac3c69ce053e8fe87204336bad4594bfc0"}
Mar 20 17:37:48 crc kubenswrapper[4795]: I0320 17:37:48.953915 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dwhh5"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.056329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data\") pod \"7b15c724-622b-4da7-96a3-01949d04ecac\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") "
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.056503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lmwj\" (UniqueName: \"kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj\") pod \"7b15c724-622b-4da7-96a3-01949d04ecac\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") "
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.056596 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle\") pod \"7b15c724-622b-4da7-96a3-01949d04ecac\" (UID: \"7b15c724-622b-4da7-96a3-01949d04ecac\") "
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.061512 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj" (OuterVolumeSpecName: "kube-api-access-6lmwj") pod "7b15c724-622b-4da7-96a3-01949d04ecac" (UID: "7b15c724-622b-4da7-96a3-01949d04ecac"). InnerVolumeSpecName "kube-api-access-6lmwj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.082962 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b15c724-622b-4da7-96a3-01949d04ecac" (UID: "7b15c724-622b-4da7-96a3-01949d04ecac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.109084 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data" (OuterVolumeSpecName: "config-data") pod "7b15c724-622b-4da7-96a3-01949d04ecac" (UID: "7b15c724-622b-4da7-96a3-01949d04ecac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.159461 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.159529 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lmwj\" (UniqueName: \"kubernetes.io/projected/7b15c724-622b-4da7-96a3-01949d04ecac-kube-api-access-6lmwj\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.159558 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15c724-622b-4da7-96a3-01949d04ecac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.636811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dwhh5" event={"ID":"7b15c724-622b-4da7-96a3-01949d04ecac","Type":"ContainerDied","Data":"aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565"}
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.636856 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa13071895e9b08b2f6b3daba25e91e30ed5814d0e07709ea9470a7ceab46565"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.636876 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dwhh5"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865024 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"]
Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865430 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b15c724-622b-4da7-96a3-01949d04ecac" containerName="keystone-db-sync"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865452 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b15c724-622b-4da7-96a3-01949d04ecac" containerName="keystone-db-sync"
Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865479 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d71698-4dc2-448a-9330-23372e2d508b" containerName="mariadb-account-create-update"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865487 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d71698-4dc2-448a-9330-23372e2d508b" containerName="mariadb-account-create-update"
Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865497 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="dnsmasq-dns"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865504 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="dnsmasq-dns"
Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865518 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b1c5f0-e7fb-44b7-8c75-c8036f371c56" containerName="mariadb-database-create"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865526 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b1c5f0-e7fb-44b7-8c75-c8036f371c56" containerName="mariadb-database-create"
Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865542 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373ddf98-d9da-4f1f-a6be-3d16e3cbad57" containerName="mariadb-database-create"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865549 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="373ddf98-d9da-4f1f-a6be-3d16e3cbad57" containerName="mariadb-database-create"
Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865561 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="init"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865569 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="init"
Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865579 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0847a4-54b5-4068-bfa8-730a19e96d9c" containerName="mariadb-account-create-update"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865587 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0847a4-54b5-4068-bfa8-730a19e96d9c" containerName="mariadb-account-create-update"
Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865604 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e065e2d4-096b-426b-a1f8-14311adb7cbc" containerName="mariadb-database-create"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865611 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e065e2d4-096b-426b-a1f8-14311adb7cbc" containerName="mariadb-database-create"
Mar 20 17:37:49 crc kubenswrapper[4795]: E0320 17:37:49.865623 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff9ec79-6bd9-470e-8a75-8df1f3c52851" containerName="mariadb-account-create-update"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865630 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff9ec79-6bd9-470e-8a75-8df1f3c52851" containerName="mariadb-account-create-update"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865945 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff9ec79-6bd9-470e-8a75-8df1f3c52851" containerName="mariadb-account-create-update"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865966 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b15c724-622b-4da7-96a3-01949d04ecac" containerName="keystone-db-sync"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865975 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="373ddf98-d9da-4f1f-a6be-3d16e3cbad57" containerName="mariadb-database-create"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.865990 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e065e2d4-096b-426b-a1f8-14311adb7cbc" containerName="mariadb-database-create"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.866005 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0847a4-54b5-4068-bfa8-730a19e96d9c" containerName="mariadb-account-create-update"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.866013 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d71698-4dc2-448a-9330-23372e2d508b" containerName="mariadb-account-create-update"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.866021 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="601af69d-c03f-4bdf-b3bf-67ba791674f9" containerName="dnsmasq-dns"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.866030 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b1c5f0-e7fb-44b7-8c75-c8036f371c56" containerName="mariadb-database-create"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.867053 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-sj85m"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.885347 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"]
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.909046 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6kxnf"]
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.910198 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6kxnf"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.912108 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.918216 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.918419 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.918540 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.918640 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29kfm"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.943751 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6kxnf"]
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974675 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974782 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjtmk\" (UniqueName: \"kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974829 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974903 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-888xb\" (UniqueName: \"kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974922 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.974988 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m"
Mar 20 17:37:49 crc kubenswrapper[4795]: I0320 17:37:49.975006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.053306 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-777644b489-7th7n"]
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.063268 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-777644b489-7th7n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.073235 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.073486 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mdfb4"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.073604 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.073717 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076536 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-888xb\" (UniqueName:
\"kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076643 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076747 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076764 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076785 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjtmk\" (UniqueName: \"kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.076823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config\") pod 
\"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.077615 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.078122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.088412 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-777644b489-7th7n"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.091515 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.091953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.092104 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.097983 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.098222 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7mx5b"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.098506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.099480 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.103262 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.107401 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.110251 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.113381 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qbkvb" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.113703 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.113868 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.126865 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7mx5b"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.132320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjtmk\" (UniqueName: \"kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk\") pod \"dnsmasq-dns-847c4cc679-sj85m\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.133937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-888xb\" (UniqueName: \"kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb\") pod \"keystone-bootstrap-6kxnf\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179558 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179610 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-868zx\" (UniqueName: \"kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179722 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179764 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179860 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.179913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmwrk\" (UniqueName: \"kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.188067 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.188626 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.205496 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-rdxps"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.207153 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.208880 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.217268 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-nfr5n"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.218473 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.223734 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m5c4m" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.223893 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.224098 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.224276 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qdq8q" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.226863 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.246640 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.247950 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rdxps"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.280460 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nfr5n"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.281363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-868zx\" (UniqueName: \"kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.282350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289393 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxdsg\" (UniqueName: \"kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289508 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289593 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289704 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289819 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.289912 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290212 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290299 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290414 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290514 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290724 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4285l\" (UniqueName: \"kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290974 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmwrk\" (UniqueName: \"kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.291058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.290028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.293094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.296130 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.297881 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.298008 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " 
pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.299010 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.306023 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.311385 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-868zx\" (UniqueName: \"kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx\") pod \"neutron-db-sync-7mx5b\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.330550 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.337366 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmwrk\" (UniqueName: \"kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk\") pod \"horizon-777644b489-7th7n\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") " pod="openstack/horizon-777644b489-7th7n" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.347228 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-4t68k"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.348191 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-4t68k"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.353330 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.353557 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qhmpx"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.393675 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j89f\" (UniqueName: \"kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.393860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.393901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.393931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4285l\" (UniqueName: \"kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.393966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.393998 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394032 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394068 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxdsg\" (UniqueName: \"kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394153 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394220 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394244 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394316 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.394368 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.397552 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.399141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.405428 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4t68k"]
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.406914 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.408249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.418785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.419392 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.425899 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.426340 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.433430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.433665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4285l\" (UniqueName: \"kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l\") pod \"cinder-db-sync-rdxps\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.451955 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"]
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.459297 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxdsg\" (UniqueName: \"kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg\") pod \"placement-db-sync-nfr5n\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496082 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j89f\" (UniqueName: \"kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496188 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496223 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496294 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-692nl\" (UniqueName: \"kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.496373 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.497364 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.497593 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.498144 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.499578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.499644 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.520455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j89f\" (UniqueName: \"kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f\") pod \"dnsmasq-dns-785d8bcb8c-dlkxl\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.520524 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"]
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.521052 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-777644b489-7th7n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.546021 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.552323 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"]
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.595628 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.598200 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.602874 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.603125 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdk4k\" (UniqueName: \"kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604443 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604463 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604512 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604535 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-692nl\" (UniqueName: \"kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604578 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.604603 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.609554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.610245 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.613318 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.627941 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.629871 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.631516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-692nl\" (UniqueName: \"kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl\") pod \"barbican-db-sync-4t68k\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " pod="openstack/barbican-db-sync-4t68k"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.632058 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s6lrt"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.632225 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.632466 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.632631 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.656134 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.656436 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rdxps"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.671024 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nfr5n"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.697181 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdk4k\" (UniqueName: \"kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706719 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706766 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706807 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706840 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706863 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706899 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6mx5\" (UniqueName: \"kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706945 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.706999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz44h\" (UniqueName: \"kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707061 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707093 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.707983 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.709039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.709292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.718148 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.719318 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4t68k"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.743621 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdk4k\" (UniqueName: \"kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k\") pod \"horizon-6f84bdc6f9-rj454\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " pod="openstack/horizon-6f84bdc6f9-rj454"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz44h\" (UniqueName: \"kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808660 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808677 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808751 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808766 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808853 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808875 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808893 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6mx5\" (UniqueName: \"kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.808922 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " 
pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.811300 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.820624 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.820895 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.822312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.822601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.822660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.825991 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.843458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6mx5\" (UniqueName: \"kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.846199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.848852 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.855456 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " 
pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.855667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.856320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.856862 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz44h\" (UniqueName: \"kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h\") pod \"ceilometer-0\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.857089 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.872600 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " pod="openstack/glance-default-external-api-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.867667 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-6kxnf"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.878761 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.909432 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"] Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.921087 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:37:50 crc kubenswrapper[4795]: I0320 17:37:50.963342 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.125237 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7mx5b"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.209600 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.214202 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.217554 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.217759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.229646 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-777644b489-7th7n"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.251110 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:37:51 crc kubenswrapper[4795]: W0320 17:37:51.297218 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a1074ea_5432_46f8_ba74_7c68912c68b6.slice/crio-715dd3d63dfa37a7a756a39bf44ed5e42b3e66055f9f285906dfbd2b63c913d6 WatchSource:0}: Error finding container 715dd3d63dfa37a7a756a39bf44ed5e42b3e66055f9f285906dfbd2b63c913d6: Status 404 returned error can't find the container with id 715dd3d63dfa37a7a756a39bf44ed5e42b3e66055f9f285906dfbd2b63c913d6 Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320491 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320579 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320600 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320640 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320966 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.320996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42fh2\" (UniqueName: 
\"kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.321011 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422468 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422883 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422949 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422970 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42fh2\" (UniqueName: \"kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422967 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.422985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.423030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.423050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" 
(UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.423772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.424046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.430993 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.431199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.432164 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc 
kubenswrapper[4795]: I0320 17:37:51.435055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.446444 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42fh2\" (UniqueName: \"kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.519671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.533296 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.706079 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rdxps"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.714134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mx5b" event={"ID":"37537245-d57e-4087-ade6-6c028eb4d137","Type":"ContainerStarted","Data":"43011a486c98482642b4a5dbe9079dc55e5de2d50808977b7d9c6649a885404a"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.714179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mx5b" event={"ID":"37537245-d57e-4087-ade6-6c028eb4d137","Type":"ContainerStarted","Data":"c771cb8aaa4f06cd374656dadf93993f0cd60baafd23685f3873cdc0a25a81b2"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.721610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6kxnf" event={"ID":"e756aad3-09ee-4c1c-b495-7417339f50e5","Type":"ContainerStarted","Data":"788d4e8fa3d02d5dbc18ca88de94a81c7f1ea8b686ca6f20236e85aa9366458c"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.721664 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6kxnf" event={"ID":"e756aad3-09ee-4c1c-b495-7417339f50e5","Type":"ContainerStarted","Data":"6a73d63fdc0ea281653f757d79fef4c1bc3b9c3ec1ea1387b57d51713a411b61"} Mar 20 17:37:51 crc kubenswrapper[4795]: W0320 17:37:51.736112 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod706c47a0_7763_44af_9b14_0e5322a8f2f1.slice/crio-d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5 WatchSource:0}: Error finding container d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5: Status 404 returned error can't find the container with id 
d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5 Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.736613 4795 generic.go:334] "Generic (PLEG): container finished" podID="48116dbd-882f-4c5e-a8fe-4bea9195e73b" containerID="2241caa74db1d0488eb5dd2d754e106b2fc62040b64e0030ffbe349d4b865937" exitCode=0 Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.736667 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" event={"ID":"48116dbd-882f-4c5e-a8fe-4bea9195e73b","Type":"ContainerDied","Data":"2241caa74db1d0488eb5dd2d754e106b2fc62040b64e0030ffbe349d4b865937"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.736705 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" event={"ID":"48116dbd-882f-4c5e-a8fe-4bea9195e73b","Type":"ContainerStarted","Data":"bf06f2e3ef91158153acc1706941dba305656126f07ac38a5867119c1c7f7e8b"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.741287 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerStarted","Data":"715dd3d63dfa37a7a756a39bf44ed5e42b3e66055f9f285906dfbd2b63c913d6"} Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.755232 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4t68k"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.765824 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nfr5n"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.767385 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7mx5b" podStartSLOduration=1.767358215 podStartE2EDuration="1.767358215s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 17:37:51.741334384 +0000 UTC m=+1215.199365915" watchObservedRunningTime="2026-03-20 17:37:51.767358215 +0000 UTC m=+1215.225389756" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.789393 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"] Mar 20 17:37:51 crc kubenswrapper[4795]: W0320 17:37:51.811871 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf79c1ee6_f8b4_485c_ac9e_667a09868206.slice/crio-f8b83339bc587b6ce7e9840f042549b74db3594a545a74800d72e3d558d164b1 WatchSource:0}: Error finding container f8b83339bc587b6ce7e9840f042549b74db3594a545a74800d72e3d558d164b1: Status 404 returned error can't find the container with id f8b83339bc587b6ce7e9840f042549b74db3594a545a74800d72e3d558d164b1 Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.827596 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.851667 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6kxnf" podStartSLOduration=2.851625387 podStartE2EDuration="2.851625387s" podCreationTimestamp="2026-03-20 17:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:51.772104842 +0000 UTC m=+1215.230136393" watchObservedRunningTime="2026-03-20 17:37:51.851625387 +0000 UTC m=+1215.309656948" Mar 20 17:37:51 crc kubenswrapper[4795]: I0320 17:37:51.928153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.052053 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.078876 4795 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.107917 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.156519 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.168142 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-fcd89d897-nsn69"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.169797 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.223398 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fcd89d897-nsn69"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.265873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.266214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.266303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts\") pod \"horizon-fcd89d897-nsn69\" (UID: 
\"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.266429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwnmk\" (UniqueName: \"kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.266454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.267191 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.370458 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.370534 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.370580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.370649 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwnmk\" (UniqueName: \"kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.370668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.371040 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.373099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.374423 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " 
pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.388430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.416058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwnmk\" (UniqueName: \"kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk\") pod \"horizon-fcd89d897-nsn69\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") " pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.419302 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.520093 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.541967 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581163 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581307 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjtmk\" (UniqueName: \"kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581346 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.581495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config\") pod \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\" (UID: \"48116dbd-882f-4c5e-a8fe-4bea9195e73b\") " Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.590614 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk" (OuterVolumeSpecName: "kube-api-access-qjtmk") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "kube-api-access-qjtmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.608876 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config" (OuterVolumeSpecName: "config") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.609704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.610497 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.616076 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.622275 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "48116dbd-882f-4c5e-a8fe-4bea9195e73b" (UID: "48116dbd-882f-4c5e-a8fe-4bea9195e73b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.689657 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.689937 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.689947 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.689959 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc 
kubenswrapper[4795]: I0320 17:37:52.689968 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjtmk\" (UniqueName: \"kubernetes.io/projected/48116dbd-882f-4c5e-a8fe-4bea9195e73b-kube-api-access-qjtmk\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.689975 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48116dbd-882f-4c5e-a8fe-4bea9195e73b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.755506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerStarted","Data":"d2d85da431f6c738cb4aa9ee890ff4d70deedc277e0a8410951bda0e019d69a8"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.757248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f84bdc6f9-rj454" event={"ID":"c2e1702a-166c-4c2e-9c39-d32a62528a89","Type":"ContainerStarted","Data":"49b0337f706e9262134a18e9a4347becf3d1ef94349c1c8c94395df30d906dd2"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.760052 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4t68k" event={"ID":"d254abd5-b344-416a-b99d-96737388795e","Type":"ContainerStarted","Data":"eaa9eee2e882516c5d4ae5df7684d52bf42c7eec92e061674b1b8ad393538f60"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.764054 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" event={"ID":"48116dbd-882f-4c5e-a8fe-4bea9195e73b","Type":"ContainerDied","Data":"bf06f2e3ef91158153acc1706941dba305656126f07ac38a5867119c1c7f7e8b"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.764155 4795 scope.go:117] "RemoveContainer" containerID="2241caa74db1d0488eb5dd2d754e106b2fc62040b64e0030ffbe349d4b865937" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 
17:37:52.764288 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-sj85m" Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.781308 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rdxps" event={"ID":"706c47a0-7763-44af-9b14-0e5322a8f2f1","Type":"ContainerStarted","Data":"d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.796032 4795 generic.go:334] "Generic (PLEG): container finished" podID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerID="65673e010192e9af6a054b2e6fafb5d1f1505b377d27e64bdbfd06c2c8d1a1c2" exitCode=0 Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.796185 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" event={"ID":"f79c1ee6-f8b4-485c-ac9e-667a09868206","Type":"ContainerDied","Data":"65673e010192e9af6a054b2e6fafb5d1f1505b377d27e64bdbfd06c2c8d1a1c2"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.796212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" event={"ID":"f79c1ee6-f8b4-485c-ac9e-667a09868206","Type":"ContainerStarted","Data":"f8b83339bc587b6ce7e9840f042549b74db3594a545a74800d72e3d558d164b1"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.800113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerStarted","Data":"5feb2d488b80d85ec78d6982b03a6895822f924d5e7cd9f2b1dd64d4c4e88e67"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.807440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfr5n" event={"ID":"78238b29-6bdd-4f77-847e-731c6c785ed9","Type":"ContainerStarted","Data":"88a328785f37d83e7d6391b28b27d2d2c6fdbb3c3985b1505819228d225ea6fa"} Mar 20 17:37:52 crc kubenswrapper[4795]: 
I0320 17:37:52.813151 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerStarted","Data":"a9bb6c0d2645cf89398ba5e524f48e4c91c8aba913f48a27e0f35d07ec1b7929"} Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.852661 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"] Mar 20 17:37:52 crc kubenswrapper[4795]: I0320 17:37:52.861061 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-sj85m"] Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.172061 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fcd89d897-nsn69"] Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.274955 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48116dbd-882f-4c5e-a8fe-4bea9195e73b" path="/var/lib/kubelet/pods/48116dbd-882f-4c5e-a8fe-4bea9195e73b/volumes" Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.825440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerStarted","Data":"02a106435121a29bb7e883006bb45d54dcf75dcebd8e8d213a1788cfe4f4db42"} Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.828256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" event={"ID":"f79c1ee6-f8b4-485c-ac9e-667a09868206","Type":"ContainerStarted","Data":"fb25591a64e281622f2e5d8c32301267612a9141ea5e32d635d846faf3ba4c18"} Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.829227 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.830959 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerStarted","Data":"8156deaf77b9791c001c9273817906ecaad8bb0c5b81a96342d9d6c47a110f46"} Mar 20 17:37:53 crc kubenswrapper[4795]: I0320 17:37:53.849796 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" podStartSLOduration=3.849772281 podStartE2EDuration="3.849772281s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:53.842862726 +0000 UTC m=+1217.300894257" watchObservedRunningTime="2026-03-20 17:37:53.849772281 +0000 UTC m=+1217.307803822" Mar 20 17:37:54 crc kubenswrapper[4795]: I0320 17:37:54.863964 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerStarted","Data":"17b7a209ce67945452fb27836d24f12b589db2a002bdd4918bcb9350ffa18491"} Mar 20 17:37:54 crc kubenswrapper[4795]: I0320 17:37:54.916026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerStarted","Data":"806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc"} Mar 20 17:37:54 crc kubenswrapper[4795]: I0320 17:37:54.916804 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-log" containerID="cri-o://8156deaf77b9791c001c9273817906ecaad8bb0c5b81a96342d9d6c47a110f46" gracePeriod=30 Mar 20 17:37:54 crc kubenswrapper[4795]: I0320 17:37:54.917206 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-httpd" 
containerID="cri-o://806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc" gracePeriod=30 Mar 20 17:37:54 crc kubenswrapper[4795]: I0320 17:37:54.976905 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.976880158 podStartE2EDuration="4.976880158s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:54.963981517 +0000 UTC m=+1218.422013078" watchObservedRunningTime="2026-03-20 17:37:54.976880158 +0000 UTC m=+1218.434911699" Mar 20 17:37:55 crc kubenswrapper[4795]: E0320 17:37:55.040737 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6db155e7_c2bd_430d_b59f_895fce359c51.slice/crio-806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6db155e7_c2bd_430d_b59f_895fce359c51.slice/crio-conmon-806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.971199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerStarted","Data":"9d38db2257f8dc5a90f78951ae37b98e9efaa5b86ae171e097328140d65c32d7"} Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.971557 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-log" containerID="cri-o://17b7a209ce67945452fb27836d24f12b589db2a002bdd4918bcb9350ffa18491" 
gracePeriod=30 Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.972012 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-httpd" containerID="cri-o://9d38db2257f8dc5a90f78951ae37b98e9efaa5b86ae171e097328140d65c32d7" gracePeriod=30 Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.997735 4795 generic.go:334] "Generic (PLEG): container finished" podID="6db155e7-c2bd-430d-b59f-895fce359c51" containerID="806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc" exitCode=143 Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.998085 4795 generic.go:334] "Generic (PLEG): container finished" podID="6db155e7-c2bd-430d-b59f-895fce359c51" containerID="8156deaf77b9791c001c9273817906ecaad8bb0c5b81a96342d9d6c47a110f46" exitCode=143 Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.997819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerDied","Data":"806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc"} Mar 20 17:37:55 crc kubenswrapper[4795]: I0320 17:37:55.998148 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerDied","Data":"8156deaf77b9791c001c9273817906ecaad8bb0c5b81a96342d9d6c47a110f46"} Mar 20 17:37:56 crc kubenswrapper[4795]: I0320 17:37:56.038901 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.038878439 podStartE2EDuration="6.038878439s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:56.011998942 +0000 UTC 
m=+1219.470030493" watchObservedRunningTime="2026-03-20 17:37:56.038878439 +0000 UTC m=+1219.496909980" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.011805 4795 generic.go:334] "Generic (PLEG): container finished" podID="e756aad3-09ee-4c1c-b495-7417339f50e5" containerID="788d4e8fa3d02d5dbc18ca88de94a81c7f1ea8b686ca6f20236e85aa9366458c" exitCode=0 Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.011906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6kxnf" event={"ID":"e756aad3-09ee-4c1c-b495-7417339f50e5","Type":"ContainerDied","Data":"788d4e8fa3d02d5dbc18ca88de94a81c7f1ea8b686ca6f20236e85aa9366458c"} Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.015948 4795 generic.go:334] "Generic (PLEG): container finished" podID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerID="9d38db2257f8dc5a90f78951ae37b98e9efaa5b86ae171e097328140d65c32d7" exitCode=0 Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.015980 4795 generic.go:334] "Generic (PLEG): container finished" podID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerID="17b7a209ce67945452fb27836d24f12b589db2a002bdd4918bcb9350ffa18491" exitCode=143 Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.016007 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerDied","Data":"9d38db2257f8dc5a90f78951ae37b98e9efaa5b86ae171e097328140d65c32d7"} Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.016036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerDied","Data":"17b7a209ce67945452fb27836d24f12b589db2a002bdd4918bcb9350ffa18491"} Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.601664 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704239 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704275 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704423 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6mx5\" (UniqueName: \"kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704513 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704544 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704560 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6db155e7-c2bd-430d-b59f-895fce359c51\" (UID: \"6db155e7-c2bd-430d-b59f-895fce359c51\") " Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704937 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.704954 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs" (OuterVolumeSpecName: "logs") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.712183 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.712863 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5" (OuterVolumeSpecName: "kube-api-access-l6mx5") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "kube-api-access-l6mx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.725331 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts" (OuterVolumeSpecName: "scripts") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.745974 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.760892 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.772865 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data" (OuterVolumeSpecName: "config-data") pod "6db155e7-c2bd-430d-b59f-895fce359c51" (UID: "6db155e7-c2bd-430d-b59f-895fce359c51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805882 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805910 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6mx5\" (UniqueName: \"kubernetes.io/projected/6db155e7-c2bd-430d-b59f-895fce359c51-kube-api-access-l6mx5\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805920 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805929 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805937 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805970 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805978 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6db155e7-c2bd-430d-b59f-895fce359c51-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.805986 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6db155e7-c2bd-430d-b59f-895fce359c51-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.821938 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Mar 20 17:37:57 crc kubenswrapper[4795]: I0320 17:37:57.907210 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.029092 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6db155e7-c2bd-430d-b59f-895fce359c51","Type":"ContainerDied","Data":"a9bb6c0d2645cf89398ba5e524f48e4c91c8aba913f48a27e0f35d07ec1b7929"}
Mar 20 17:38:58 crc kubenswrapper[4795]: I0320 17:37:58.029103 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.029353 4795 scope.go:117] "RemoveContainer" containerID="806bcf708048026ca5ad090a2d7ea81fda282d90f0b0eee1c86c78da0b398bbc"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.068490 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.079171 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.091138 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:37:58 crc kubenswrapper[4795]: E0320 17:37:58.093635 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-httpd"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.093701 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-httpd"
Mar 20 17:37:58 crc kubenswrapper[4795]: E0320 17:37:58.093725 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48116dbd-882f-4c5e-a8fe-4bea9195e73b" containerName="init"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.093736 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="48116dbd-882f-4c5e-a8fe-4bea9195e73b" containerName="init"
Mar 20 17:37:58 crc kubenswrapper[4795]: E0320 17:37:58.093771 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-log"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.093779 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-log"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.093985 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="48116dbd-882f-4c5e-a8fe-4bea9195e73b" containerName="init"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.094012 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-httpd"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.094026 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" containerName="glance-log"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.095102 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.097427 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.098208 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.107264 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.211902 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.211955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212039 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212094 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212173 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5q4r\" (UniqueName: \"kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.212217 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316354 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5q4r\" (UniqueName: \"kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.316486 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.318446 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.319250 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.327196 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.333439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.333844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.334891 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.335920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.355365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5q4r\" (UniqueName: \"kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.404908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.439146 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-777644b489-7th7n"]
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.452718 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"]
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.453993 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.457707 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.468359 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"]
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.516309 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fcd89d897-nsn69"]
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520098 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6ghq\" (UniqueName: \"kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520169 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520263 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.520291 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.549460 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7fb74ddb8-dbrvh"]
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.560134 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb74ddb8-dbrvh"]
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.560235 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.567738 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.568347 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621475 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6ghq\" (UniqueName: \"kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621597 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.621668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.622708 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.623390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.623854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.625698 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.629776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.640285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.647327 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6ghq\" (UniqueName: \"kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq\") pod \"horizon-698b6ff5c8-7p5rs\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") " pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723248 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-scripts\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723311 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-config-data\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-logs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-tls-certs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723503 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzvp\" (UniqueName: \"kubernetes.io/projected/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-kube-api-access-lvzvp\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723522 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-combined-ca-bundle\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.723561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-secret-key\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.776219 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.824871 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-tls-certs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825008 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzvp\" (UniqueName: \"kubernetes.io/projected/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-kube-api-access-lvzvp\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-combined-ca-bundle\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825084 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-secret-key\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-scripts\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-config-data\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825172 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-logs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.825788 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-logs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.827338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-scripts\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.828153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-config-data\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.830378 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-combined-ca-bundle\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.832334 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-tls-certs\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.832827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-horizon-secret-key\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.842342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzvp\" (UniqueName: \"kubernetes.io/projected/f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109-kube-api-access-lvzvp\") pod \"horizon-7fb74ddb8-dbrvh\" (UID: \"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109\") " pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:58 crc kubenswrapper[4795]: I0320 17:37:58.902229 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:37:59 crc kubenswrapper[4795]: I0320 17:37:59.264265 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6db155e7-c2bd-430d-b59f-895fce359c51" path="/var/lib/kubelet/pods/6db155e7-c2bd-430d-b59f-895fce359c51/volumes"
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.134459 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567138-7flct"]
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.135644 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-7flct"
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.138018 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.138386 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.139556 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.165761 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-7flct"]
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.267126 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff42s\" (UniqueName: \"kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s\") pod \"auto-csr-approver-29567138-7flct\" (UID: \"e83d2a1a-2b3b-409a-997a-672e322b1d8e\") " pod="openshift-infra/auto-csr-approver-29567138-7flct"
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.369470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff42s\" (UniqueName: \"kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s\") pod \"auto-csr-approver-29567138-7flct\" (UID: \"e83d2a1a-2b3b-409a-997a-672e322b1d8e\") " pod="openshift-infra/auto-csr-approver-29567138-7flct"
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.405373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff42s\" (UniqueName: \"kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s\") pod \"auto-csr-approver-29567138-7flct\" (UID: \"e83d2a1a-2b3b-409a-997a-672e322b1d8e\") " pod="openshift-infra/auto-csr-approver-29567138-7flct"
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.472119 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-7flct"
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.698926 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl"
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.749224 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"]
Mar 20 17:38:00 crc kubenswrapper[4795]: I0320 17:38:00.749665 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" containerID="cri-o://46bc17154a9a8ac331153a8f5a685f3f770ff87ff542f8ade4f4cba000f1d481" gracePeriod=10
Mar 20 17:38:02 crc kubenswrapper[4795]: I0320 17:38:02.064123 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed82521a-8a78-4611-870f-5ad53625bddf" containerID="46bc17154a9a8ac331153a8f5a685f3f770ff87ff542f8ade4f4cba000f1d481" exitCode=0
Mar 20 17:38:02 crc kubenswrapper[4795]: I0320 17:38:02.064171 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" event={"ID":"ed82521a-8a78-4611-870f-5ad53625bddf","Type":"ContainerDied","Data":"46bc17154a9a8ac331153a8f5a685f3f770ff87ff542f8ade4f4cba000f1d481"}
Mar 20 17:38:04 crc kubenswrapper[4795]: I0320 17:38:04.028490 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused"
Mar 20 17:38:07 crc kubenswrapper[4795]: E0320 17:38:07.623466 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified"
Mar 20 17:38:07 crc kubenswrapper[4795]: E0320 17:38:07.626094 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n566h7ch589h555h5fbhd7h97h587h55bh658hc4h656h5d6h656h5ddh669h95h5c6h5b7h598h5d6hd8h59dh646hfbhd6hb9h5d6h8fh5dch5cfh64q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdk4k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6f84bdc6f9-rj454_openstack(c2e1702a-166c-4c2e-9c39-d32a62528a89): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:38:07 crc kubenswrapper[4795]: E0320 
17:38:07.629649 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6f84bdc6f9-rj454" podUID="c2e1702a-166c-4c2e-9c39-d32a62528a89" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.725164 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.731156 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922446 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922830 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922849 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922886 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922934 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.922958 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-888xb\" (UniqueName: \"kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923005 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42fh2\" (UniqueName: \"kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: 
\"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923126 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923145 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923163 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923176 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts\") pod \"67dd868e-24f8-426f-b835-1e92ab4441e6\" (UID: \"67dd868e-24f8-426f-b835-1e92ab4441e6\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923195 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle\") pod \"e756aad3-09ee-4c1c-b495-7417339f50e5\" (UID: \"e756aad3-09ee-4c1c-b495-7417339f50e5\") " Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923241 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs" (OuterVolumeSpecName: "logs") pod 
"67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.923515 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.924179 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.929758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.929762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb" (OuterVolumeSpecName: "kube-api-access-888xb") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "kube-api-access-888xb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.929814 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.931469 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts" (OuterVolumeSpecName: "scripts") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.931568 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.946742 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts" (OuterVolumeSpecName: "scripts") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.949398 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.951468 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data" (OuterVolumeSpecName: "config-data") pod "e756aad3-09ee-4c1c-b495-7417339f50e5" (UID: "e756aad3-09ee-4c1c-b495-7417339f50e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.957232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2" (OuterVolumeSpecName: "kube-api-access-42fh2") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "kube-api-access-42fh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.961632 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.983837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4795]: I0320 17:38:07.987679 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data" (OuterVolumeSpecName: "config-data") pod "67dd868e-24f8-426f-b835-1e92ab4441e6" (UID: "67dd868e-24f8-426f-b835-1e92ab4441e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025619 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025665 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025677 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025706 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc 
kubenswrapper[4795]: I0320 17:38:08.025716 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67dd868e-24f8-426f-b835-1e92ab4441e6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025724 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025732 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025741 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025749 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-888xb\" (UniqueName: \"kubernetes.io/projected/e756aad3-09ee-4c1c-b495-7417339f50e5-kube-api-access-888xb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025758 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67dd868e-24f8-426f-b835-1e92ab4441e6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025781 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42fh2\" (UniqueName: \"kubernetes.io/projected/67dd868e-24f8-426f-b835-1e92ab4441e6-kube-api-access-42fh2\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025789 4795 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/e756aad3-09ee-4c1c-b495-7417339f50e5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.025821 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.042009 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.121650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6kxnf" event={"ID":"e756aad3-09ee-4c1c-b495-7417339f50e5","Type":"ContainerDied","Data":"6a73d63fdc0ea281653f757d79fef4c1bc3b9c3ec1ea1387b57d51713a411b61"} Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.121729 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6kxnf" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.121734 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a73d63fdc0ea281653f757d79fef4c1bc3b9c3ec1ea1387b57d51713a411b61" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.124649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"67dd868e-24f8-426f-b835-1e92ab4441e6","Type":"ContainerDied","Data":"5feb2d488b80d85ec78d6982b03a6895822f924d5e7cd9f2b1dd64d4c4e88e67"} Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.124814 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.127240 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.177825 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.185487 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.201985 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:38:08 crc kubenswrapper[4795]: E0320 17:38:08.202293 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-httpd" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202305 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-httpd" Mar 20 17:38:08 crc kubenswrapper[4795]: E0320 17:38:08.202317 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e756aad3-09ee-4c1c-b495-7417339f50e5" containerName="keystone-bootstrap" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202323 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e756aad3-09ee-4c1c-b495-7417339f50e5" containerName="keystone-bootstrap" Mar 20 17:38:08 crc kubenswrapper[4795]: E0320 17:38:08.202335 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-log" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202341 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-log" Mar 20 
17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202492 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-log" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202509 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" containerName="glance-httpd" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.202519 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e756aad3-09ee-4c1c-b495-7417339f50e5" containerName="keystone-bootstrap" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.203302 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.208412 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.208602 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.212010 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.329864 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.329917 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.329964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.329979 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.329997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.330021 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.330045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8cs8\" (UniqueName: \"kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8\") pod 
\"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.330145 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.431846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.431896 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.431937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.432427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run\") pod \"glance-default-internal-api-0\" 
(UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.432583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.432606 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.432944 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.432854 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.433289 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 
20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.433324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.433381 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8cs8\" (UniqueName: \"kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.436133 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.436302 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.436918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.438747 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.451051 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8cs8\" (UniqueName: \"kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.462637 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.523845 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.844328 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6kxnf"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.851285 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6kxnf"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.960819 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qfpzw"] Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.962050 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.964114 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.967193 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.967258 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29kfm" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.967362 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.967468 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:38:08 crc kubenswrapper[4795]: I0320 17:38:08.970547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qfpzw"] Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.146717 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2c2n\" (UniqueName: \"kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.146839 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.146866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.146969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.147031 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.147058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248320 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2c2n\" (UniqueName: \"kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248487 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.248580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.252696 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts\") pod \"keystone-bootstrap-qfpzw\" (UID: 
\"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.255003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.255033 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.255847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.257424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.266320 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67dd868e-24f8-426f-b835-1e92ab4441e6" path="/var/lib/kubelet/pods/67dd868e-24f8-426f-b835-1e92ab4441e6/volumes" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.266957 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e756aad3-09ee-4c1c-b495-7417339f50e5" 
path="/var/lib/kubelet/pods/e756aad3-09ee-4c1c-b495-7417339f50e5/volumes" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.267469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2c2n\" (UniqueName: \"kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n\") pod \"keystone-bootstrap-qfpzw\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:09 crc kubenswrapper[4795]: I0320 17:38:09.285633 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:11 crc kubenswrapper[4795]: I0320 17:38:11.300244 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:38:11 crc kubenswrapper[4795]: I0320 17:38:11.300541 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:38:14 crc kubenswrapper[4795]: I0320 17:38:14.028927 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Mar 20 17:38:15 crc kubenswrapper[4795]: I0320 17:38:15.844682 4795 scope.go:117] "RemoveContainer" containerID="8156deaf77b9791c001c9273817906ecaad8bb0c5b81a96342d9d6c47a110f46" Mar 20 17:38:16 crc kubenswrapper[4795]: E0320 17:38:16.361404 4795 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 17:38:16 crc kubenswrapper[4795]: E0320 17:38:16.361946 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-692nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
barbican-db-sync-4t68k_openstack(d254abd5-b344-416a-b99d-96737388795e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:38:16 crc kubenswrapper[4795]: E0320 17:38:16.363136 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-4t68k" podUID="d254abd5-b344-416a-b99d-96737388795e" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.497254 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.504239 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.588824 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdk4k\" (UniqueName: \"kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k\") pod \"c2e1702a-166c-4c2e-9c39-d32a62528a89\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.588903 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.588932 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxcnf\" (UniqueName: \"kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 
crc kubenswrapper[4795]: I0320 17:38:16.588953 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.588988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data\") pod \"c2e1702a-166c-4c2e-9c39-d32a62528a89\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589014 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs\") pod \"c2e1702a-166c-4c2e-9c39-d32a62528a89\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589076 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts\") pod 
\"c2e1702a-166c-4c2e-9c39-d32a62528a89\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589170 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key\") pod \"c2e1702a-166c-4c2e-9c39-d32a62528a89\" (UID: \"c2e1702a-166c-4c2e-9c39-d32a62528a89\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.589206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb\") pod \"ed82521a-8a78-4611-870f-5ad53625bddf\" (UID: \"ed82521a-8a78-4611-870f-5ad53625bddf\") " Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.590364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data" (OuterVolumeSpecName: "config-data") pod "c2e1702a-166c-4c2e-9c39-d32a62528a89" (UID: "c2e1702a-166c-4c2e-9c39-d32a62528a89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.590860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts" (OuterVolumeSpecName: "scripts") pod "c2e1702a-166c-4c2e-9c39-d32a62528a89" (UID: "c2e1702a-166c-4c2e-9c39-d32a62528a89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.590989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs" (OuterVolumeSpecName: "logs") pod "c2e1702a-166c-4c2e-9c39-d32a62528a89" (UID: "c2e1702a-166c-4c2e-9c39-d32a62528a89"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.595132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c2e1702a-166c-4c2e-9c39-d32a62528a89" (UID: "c2e1702a-166c-4c2e-9c39-d32a62528a89"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.596087 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k" (OuterVolumeSpecName: "kube-api-access-fdk4k") pod "c2e1702a-166c-4c2e-9c39-d32a62528a89" (UID: "c2e1702a-166c-4c2e-9c39-d32a62528a89"). InnerVolumeSpecName "kube-api-access-fdk4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.595991 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf" (OuterVolumeSpecName: "kube-api-access-xxcnf") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "kube-api-access-xxcnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.635928 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config" (OuterVolumeSpecName: "config") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.637486 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.640476 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.640634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.642494 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ed82521a-8a78-4611-870f-5ad53625bddf" (UID: "ed82521a-8a78-4611-870f-5ad53625bddf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690451 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdk4k\" (UniqueName: \"kubernetes.io/projected/c2e1702a-166c-4c2e-9c39-d32a62528a89-kube-api-access-fdk4k\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690491 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690503 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxcnf\" (UniqueName: \"kubernetes.io/projected/ed82521a-8a78-4611-870f-5ad53625bddf-kube-api-access-xxcnf\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690514 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690526 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690538 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e1702a-166c-4c2e-9c39-d32a62528a89-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690549 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690561 4795 reconciler_common.go:293] 
"Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690572 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c2e1702a-166c-4c2e-9c39-d32a62528a89-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690583 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c2e1702a-166c-4c2e-9c39-d32a62528a89-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4795]: I0320 17:38:16.690593 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed82521a-8a78-4611-870f-5ad53625bddf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.201095 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" event={"ID":"ed82521a-8a78-4611-870f-5ad53625bddf","Type":"ContainerDied","Data":"4bbbefbb238ecd4d186023f5577494b794cb2bb3b7f5edef795d647b067a660b"} Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.201207 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.203829 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f84bdc6f9-rj454" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.208012 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f84bdc6f9-rj454" event={"ID":"c2e1702a-166c-4c2e-9c39-d32a62528a89","Type":"ContainerDied","Data":"49b0337f706e9262134a18e9a4347becf3d1ef94349c1c8c94395df30d906dd2"} Mar 20 17:38:17 crc kubenswrapper[4795]: E0320 17:38:17.209878 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-4t68k" podUID="d254abd5-b344-416a-b99d-96737388795e" Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.288900 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"] Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.297143 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f84bdc6f9-rj454"] Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.305139 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"] Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.311985 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g56fp"] Mar 20 17:38:17 crc kubenswrapper[4795]: E0320 17:38:17.625564 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 17:38:17 crc kubenswrapper[4795]: E0320 17:38:17.625994 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4285l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-rdxps_openstack(706c47a0-7763-44af-9b14-0e5322a8f2f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 17:38:17 crc kubenswrapper[4795]: E0320 17:38:17.627449 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-rdxps" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1"
Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.630222 4795 scope.go:117] "RemoveContainer" containerID="9d38db2257f8dc5a90f78951ae37b98e9efaa5b86ae171e097328140d65c32d7"
Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.870952 4795 scope.go:117] "RemoveContainer" containerID="17b7a209ce67945452fb27836d24f12b589db2a002bdd4918bcb9350ffa18491"
Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.903851 4795 scope.go:117] "RemoveContainer" containerID="46bc17154a9a8ac331153a8f5a685f3f770ff87ff542f8ade4f4cba000f1d481"
Mar 20 17:38:17 crc kubenswrapper[4795]: I0320 17:38:17.936209 4795 scope.go:117] "RemoveContainer" containerID="99144a28a69d3c6fb3d096bda447f3ff1d42028233b58205725db4c507c27464"
Mar 20 17:38:18 crc kubenswrapper[4795]: W0320 17:38:18.165304 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3bb3cb2_7e6a_4c4d_9cb9_cd8d6683c109.slice/crio-768cfe68194d37fc422479537b1915f4748384dcf8df7aa52ba261d9397d31b5 WatchSource:0}: Error finding container 768cfe68194d37fc422479537b1915f4748384dcf8df7aa52ba261d9397d31b5: Status 404 returned error can't find the container with id 768cfe68194d37fc422479537b1915f4748384dcf8df7aa52ba261d9397d31b5
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.171517 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb74ddb8-dbrvh"]
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.217131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerStarted","Data":"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012"}
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.217173 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerStarted","Data":"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686"}
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.217286 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-777644b489-7th7n" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon-log" containerID="cri-o://fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686" gracePeriod=30
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.218810 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-777644b489-7th7n" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon" containerID="cri-o://8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012" gracePeriod=30
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.232368 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerStarted","Data":"3b6b098ddb9cfeee495acedec7b7145d7d7ba5c2f18ba21ad1f2ac7b8c96c1b3"}
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.233963 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb74ddb8-dbrvh" event={"ID":"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109","Type":"ContainerStarted","Data":"768cfe68194d37fc422479537b1915f4748384dcf8df7aa52ba261d9397d31b5"}
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.240822 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfr5n" event={"ID":"78238b29-6bdd-4f77-847e-731c6c785ed9","Type":"ContainerStarted","Data":"8f1173f5ebccc23013501b8ad9c477f608df64a752414481775b8bf5160525e1"}
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.243875 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-777644b489-7th7n" podStartSLOduration=3.152344978 podStartE2EDuration="28.243858442s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="2026-03-20 17:37:51.301522343 +0000 UTC m=+1214.759553884" lastFinishedPulling="2026-03-20 17:38:16.393035807 +0000 UTC m=+1239.851067348" observedRunningTime="2026-03-20 17:38:18.23899012 +0000 UTC m=+1241.697021661" watchObservedRunningTime="2026-03-20 17:38:18.243858442 +0000 UTC m=+1241.701889983"
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.251517 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-fcd89d897-nsn69" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon-log" containerID="cri-o://71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" gracePeriod=30
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.251951 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerStarted","Data":"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d"}
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.252072 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerStarted","Data":"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449"}
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.253068 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-fcd89d897-nsn69" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon" containerID="cri-o://109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" gracePeriod=30
Mar 20 17:38:18 crc kubenswrapper[4795]: E0320 17:38:18.258364 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-rdxps" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1"
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.290007 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-nfr5n" podStartSLOduration=3.638139245 podStartE2EDuration="28.289990702s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="2026-03-20 17:37:51.741213771 +0000 UTC m=+1215.199245312" lastFinishedPulling="2026-03-20 17:38:16.393065208 +0000 UTC m=+1239.851096769" observedRunningTime="2026-03-20 17:38:18.258021977 +0000 UTC m=+1241.716053508" watchObservedRunningTime="2026-03-20 17:38:18.289990702 +0000 UTC m=+1241.748022243"
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.298488 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-7flct"]
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.318803 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qfpzw"]
Mar 20 17:38:18 crc kubenswrapper[4795]: W0320 17:38:18.318821 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4244f6d6_536a_4555_a05b_176d696d427d.slice/crio-f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6 WatchSource:0}: Error finding container f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6: Status 404 returned error can't find the container with id f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.319261 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-fcd89d897-nsn69" podStartSLOduration=1.812779688 podStartE2EDuration="26.319252191s" podCreationTimestamp="2026-03-20 17:37:52 +0000 UTC" firstStartedPulling="2026-03-20 17:37:53.184941894 +0000 UTC m=+1216.642973435" lastFinishedPulling="2026-03-20 17:38:17.691414377 +0000 UTC m=+1241.149445938" observedRunningTime="2026-03-20 17:38:18.30810472 +0000 UTC m=+1241.766136261" watchObservedRunningTime="2026-03-20 17:38:18.319252191 +0000 UTC m=+1241.777283732"
Mar 20 17:38:18 crc kubenswrapper[4795]: W0320 17:38:18.321812 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3e822b2_0b57_4f89_ab29_caeb483457a1.slice/crio-b4a6e3e35ee28437bb36524dd75862315999254a91ff6cc4192d379a2a0e45e4 WatchSource:0}: Error finding container b4a6e3e35ee28437bb36524dd75862315999254a91ff6cc4192d379a2a0e45e4: Status 404 returned error can't find the container with id b4a6e3e35ee28437bb36524dd75862315999254a91ff6cc4192d379a2a0e45e4
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.335180 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"]
Mar 20 17:38:18 crc kubenswrapper[4795]: I0320 17:38:18.494708 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 17:38:18 crc kubenswrapper[4795]: W0320 17:38:18.501708 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e24d4d3_23ba_4ab0_a5af_3a6dfc19c197.slice/crio-763133b846bce72cdf94f169388305eb03a69632768733042144310abb80652c WatchSource:0}: Error finding container 763133b846bce72cdf94f169388305eb03a69632768733042144310abb80652c: Status 404 returned error can't find the container with id 763133b846bce72cdf94f169388305eb03a69632768733042144310abb80652c
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.033612 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-g56fp" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout"
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.242144 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.266160 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e1702a-166c-4c2e-9c39-d32a62528a89" path="/var/lib/kubelet/pods/c2e1702a-166c-4c2e-9c39-d32a62528a89/volumes"
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.267155 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" path="/var/lib/kubelet/pods/ed82521a-8a78-4611-870f-5ad53625bddf/volumes"
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.271873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerStarted","Data":"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006"}
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.271921 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerStarted","Data":"763133b846bce72cdf94f169388305eb03a69632768733042144310abb80652c"}
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.305012 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb74ddb8-dbrvh" event={"ID":"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109","Type":"ContainerStarted","Data":"d39fafb68e4cc0a60f4946e801afe6e5a5c3ec76ae4ec0eb5f1dba6501b35e42"}
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.305068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb74ddb8-dbrvh" event={"ID":"f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109","Type":"ContainerStarted","Data":"f4bf1c555c928a4c546cc74f9c927811c7ebd6235271eebb506ed1f67527a50b"}
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.330387 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7fb74ddb8-dbrvh" podStartSLOduration=21.330370076 podStartE2EDuration="21.330370076s" podCreationTimestamp="2026-03-20 17:37:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:19.324039897 +0000 UTC m=+1242.782071438" watchObservedRunningTime="2026-03-20 17:38:19.330370076 +0000 UTC m=+1242.788401607"
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.346370 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfpzw" event={"ID":"4244f6d6-536a-4555-a05b-176d696d427d","Type":"ContainerStarted","Data":"8e4952423fe886bac972193165a7d0b5d846db9f137b7cbf7c828182ef389d13"}
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.346410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfpzw" event={"ID":"4244f6d6-536a-4555-a05b-176d696d427d","Type":"ContainerStarted","Data":"f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6"}
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.366965 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerStarted","Data":"5be00c0e636ec09ccd42a36c542755b2d984e3e3c6dddd06a91f3eb8b8a7efdb"}
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.367012 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerStarted","Data":"da11e766148fb6f38d02c50468b495d9c10ec9fe653ddad3b144b8edd961b2d3"}
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.367022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerStarted","Data":"b4a6e3e35ee28437bb36524dd75862315999254a91ff6cc4192d379a2a0e45e4"}
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.374474 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-7flct" event={"ID":"e83d2a1a-2b3b-409a-997a-672e322b1d8e","Type":"ContainerStarted","Data":"d0709eba067851036b3ccf0f38eb78dfb3069d88982f3378db1881ed2de27d68"}
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.377751 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qfpzw" podStartSLOduration=11.377726744 podStartE2EDuration="11.377726744s" podCreationTimestamp="2026-03-20 17:38:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:19.360804402 +0000 UTC m=+1242.818835963" watchObservedRunningTime="2026-03-20 17:38:19.377726744 +0000 UTC m=+1242.835758285"
Mar 20 17:38:19 crc kubenswrapper[4795]: I0320 17:38:19.390939 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-698b6ff5c8-7p5rs" podStartSLOduration=21.390920528 podStartE2EDuration="21.390920528s" podCreationTimestamp="2026-03-20 17:37:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:19.38845065 +0000 UTC m=+1242.846482221" watchObservedRunningTime="2026-03-20 17:38:19.390920528 +0000 UTC m=+1242.848952089"
Mar 20 17:38:20 crc kubenswrapper[4795]: I0320 17:38:20.406164 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerStarted","Data":"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619"}
Mar 20 17:38:20 crc kubenswrapper[4795]: I0320 17:38:20.411256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerStarted","Data":"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"}
Mar 20 17:38:20 crc kubenswrapper[4795]: I0320 17:38:20.411345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerStarted","Data":"0cc2eb8ef99525fb1f871b8b8fb5220a96df925d928238b2cb6ec98c0cfc670e"}
Mar 20 17:38:20 crc kubenswrapper[4795]: I0320 17:38:20.442614 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.442592768 podStartE2EDuration="12.442592768s" podCreationTimestamp="2026-03-20 17:38:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:20.435196035 +0000 UTC m=+1243.893227596" watchObservedRunningTime="2026-03-20 17:38:20.442592768 +0000 UTC m=+1243.900624319"
Mar 20 17:38:20 crc kubenswrapper[4795]: I0320 17:38:20.521482 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-777644b489-7th7n"
Mar 20 17:38:22 crc kubenswrapper[4795]: I0320 17:38:22.521424 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-fcd89d897-nsn69"
Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.480306 4795 generic.go:334] "Generic (PLEG): container finished" podID="e83d2a1a-2b3b-409a-997a-672e322b1d8e" containerID="27cb2cc4ca0cf03af5e4f56a72a8901b4a28c70c5abb54e1f86d55c8053dcc74" exitCode=0
Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.480362 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-7flct" event={"ID":"e83d2a1a-2b3b-409a-997a-672e322b1d8e","Type":"ContainerDied","Data":"27cb2cc4ca0cf03af5e4f56a72a8901b4a28c70c5abb54e1f86d55c8053dcc74"}
Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.483284 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerStarted","Data":"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"}
Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.483365 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-log" containerID="cri-o://ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6" gracePeriod=30
Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.483382 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-httpd" containerID="cri-o://f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f" gracePeriod=30
Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.487681 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerStarted","Data":"5d540cc1bf447dcf67570fd849a590667f72184381546b2b04f9eddfb973cf69"}
Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.497076 4795 generic.go:334] "Generic (PLEG): container finished" podID="78238b29-6bdd-4f77-847e-731c6c785ed9" containerID="8f1173f5ebccc23013501b8ad9c477f608df64a752414481775b8bf5160525e1" exitCode=0
Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.497156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfr5n" event={"ID":"78238b29-6bdd-4f77-847e-731c6c785ed9","Type":"ContainerDied","Data":"8f1173f5ebccc23013501b8ad9c477f608df64a752414481775b8bf5160525e1"}
Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.501976 4795 generic.go:334] "Generic (PLEG): container finished" podID="4244f6d6-536a-4555-a05b-176d696d427d" containerID="8e4952423fe886bac972193165a7d0b5d846db9f137b7cbf7c828182ef389d13" exitCode=0
Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.502021 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfpzw" event={"ID":"4244f6d6-536a-4555-a05b-176d696d427d","Type":"ContainerDied","Data":"8e4952423fe886bac972193165a7d0b5d846db9f137b7cbf7c828182ef389d13"}
Mar 20 17:38:23 crc kubenswrapper[4795]: I0320 17:38:23.515533 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=25.515515166 podStartE2EDuration="25.515515166s" podCreationTimestamp="2026-03-20 17:37:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:23.510163218 +0000 UTC m=+1246.968194759" watchObservedRunningTime="2026-03-20 17:38:23.515515166 +0000 UTC m=+1246.973546707"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.202151 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.232756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") "
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.232841 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") "
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.232938 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") "
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233005 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") "
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") "
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233068 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") "
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") "
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233150 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5q4r\" (UniqueName: \"kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r\") pod \"1e38733a-b81f-4fc5-9ef5-22e14c513263\" (UID: \"1e38733a-b81f-4fc5-9ef5-22e14c513263\") "
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.233882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.234232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs" (OuterVolumeSpecName: "logs") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.234629 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.234679 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e38733a-b81f-4fc5-9ef5-22e14c513263-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.244086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts" (OuterVolumeSpecName: "scripts") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.246891 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r" (OuterVolumeSpecName: "kube-api-access-b5q4r") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "kube-api-access-b5q4r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.267932 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.292979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.296397 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.301802 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data" (OuterVolumeSpecName: "config-data") pod "1e38733a-b81f-4fc5-9ef5-22e14c513263" (UID: "1e38733a-b81f-4fc5-9ef5-22e14c513263"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337364 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337420 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337439 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5q4r\" (UniqueName: \"kubernetes.io/projected/1e38733a-b81f-4fc5-9ef5-22e14c513263-kube-api-access-b5q4r\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337454 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337467 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.337479 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e38733a-b81f-4fc5-9ef5-22e14c513263-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.365633 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.439803 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512531 4795 generic.go:334] "Generic (PLEG): container finished" podID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerID="f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f" exitCode=0
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512563 4795 generic.go:334] "Generic (PLEG): container finished" podID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerID="ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6" exitCode=143
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerDied","Data":"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"}
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512665 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512698 4795 scope.go:117] "RemoveContainer" containerID="f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512671 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerDied","Data":"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"}
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.512845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1e38733a-b81f-4fc5-9ef5-22e14c513263","Type":"ContainerDied","Data":"0cc2eb8ef99525fb1f871b8b8fb5220a96df925d928238b2cb6ec98c0cfc670e"}
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.516487 4795 generic.go:334] "Generic (PLEG): container finished" podID="37537245-d57e-4087-ade6-6c028eb4d137" containerID="43011a486c98482642b4a5dbe9079dc55e5de2d50808977b7d9c6649a885404a" exitCode=0
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.516713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mx5b" event={"ID":"37537245-d57e-4087-ade6-6c028eb4d137","Type":"ContainerDied","Data":"43011a486c98482642b4a5dbe9079dc55e5de2d50808977b7d9c6649a885404a"}
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.546839 4795 scope.go:117] "RemoveContainer" containerID="ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.583691 4795 scope.go:117] "RemoveContainer" containerID="f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"
Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.587854 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f\": container with ID starting with f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f not found: ID does not exist" containerID="f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.587902 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"} err="failed to get container status \"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f\": rpc error: code = NotFound desc = could not find container \"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f\": container with ID starting with f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f not found: ID does not exist"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.587927 4795 scope.go:117] "RemoveContainer" containerID="ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"
Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.589553 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6\": container with ID starting with ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6 not found: ID does not exist" containerID="ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.589618 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"} err="failed to get container status \"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6\": rpc error: code = NotFound desc = could not find container \"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6\": container with ID starting with ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6 not found: ID does not exist"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.589645 4795 scope.go:117] "RemoveContainer" containerID="f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.597832 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f"} err="failed to get container status \"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f\": rpc error: code = NotFound desc = could not find container \"f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f\": container with ID starting with f86bb87639918f36647288ba65b8c52137ee3d0078f6de79c0d9573b8a8f618f not found: ID does not exist"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.597876 4795 scope.go:117] "RemoveContainer" containerID="ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.599713 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6"} err="failed to get container status \"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6\": rpc error: code = NotFound desc = could not find container \"ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6\": container with ID starting with ac5334f03dbcef539e36f408a4001b1f26e9e103c05ff79ab841cb24ca082df6 not found: ID does not exist"
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.619693 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.625746 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:38:24
crc kubenswrapper[4795]: I0320 17:38:24.637820 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.638292 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-log" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638308 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-log" Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.638331 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-httpd" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638339 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-httpd" Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.638361 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638381 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" Mar 20 17:38:24 crc kubenswrapper[4795]: E0320 17:38:24.638403 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="init" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638412 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="init" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638601 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed82521a-8a78-4611-870f-5ad53625bddf" containerName="dnsmasq-dns" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638644 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-httpd" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.638671 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" containerName="glance-log" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.639844 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.645590 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.645898 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.685088 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748007 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748273 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22mcg\" (UniqueName: \"kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748378 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.748434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22mcg\" (UniqueName: \"kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs\") pod 
\"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849683 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.849749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.851953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.852573 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") device 
mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.853165 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.858047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.859483 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.860074 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.867365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22mcg\" (UniqueName: \"kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc 
kubenswrapper[4795]: I0320 17:38:24.868486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.901142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " pod="openstack/glance-default-external-api-0" Mar 20 17:38:24 crc kubenswrapper[4795]: I0320 17:38:24.977928 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:38:25 crc kubenswrapper[4795]: I0320 17:38:25.262216 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e38733a-b81f-4fc5-9ef5-22e14c513263" path="/var/lib/kubelet/pods/1e38733a-b81f-4fc5-9ef5-22e14c513263/volumes" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.152930 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186404 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186490 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2c2n\" (UniqueName: \"kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186534 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186569 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.186757 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle\") pod \"4244f6d6-536a-4555-a05b-176d696d427d\" (UID: \"4244f6d6-536a-4555-a05b-176d696d427d\") " Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.196201 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n" (OuterVolumeSpecName: "kube-api-access-g2c2n") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "kube-api-access-g2c2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.196219 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts" (OuterVolumeSpecName: "scripts") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.196608 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.212940 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.215385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data" (OuterVolumeSpecName: "config-data") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.232921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4244f6d6-536a-4555-a05b-176d696d427d" (UID: "4244f6d6-536a-4555-a05b-176d696d427d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288770 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288812 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288825 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2c2n\" (UniqueName: \"kubernetes.io/projected/4244f6d6-536a-4555-a05b-176d696d427d-kube-api-access-g2c2n\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288839 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 
17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288851 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.288862 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4244f6d6-536a-4555-a05b-176d696d427d-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.546751 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfpzw" event={"ID":"4244f6d6-536a-4555-a05b-176d696d427d","Type":"ContainerDied","Data":"f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6"} Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.546791 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f38d36a84c781b3d8548b1887105b51c5caf1e6e7a1de9b3b1806147df89e8d6" Mar 20 17:38:26 crc kubenswrapper[4795]: I0320 17:38:26.546844 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qfpzw" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.353286 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-85b996ff68-fdzxg"] Mar 20 17:38:27 crc kubenswrapper[4795]: E0320 17:38:27.353721 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4244f6d6-536a-4555-a05b-176d696d427d" containerName="keystone-bootstrap" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.353734 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4244f6d6-536a-4555-a05b-176d696d427d" containerName="keystone-bootstrap" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.353906 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4244f6d6-536a-4555-a05b-176d696d427d" containerName="keystone-bootstrap" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.354502 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.358128 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-29kfm" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.358291 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.358461 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.360260 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.360422 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.360567 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"keystone-scripts" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.384602 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85b996ff68-fdzxg"] Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.433994 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-credential-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-config-data\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-combined-ca-bundle\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434143 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-public-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434164 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-scripts\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434198 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flrft\" (UniqueName: \"kubernetes.io/projected/7b20a034-11f6-40ad-9447-32c49f705c07-kube-api-access-flrft\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434286 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-fernet-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.434314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-internal-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.535933 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-fernet-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-internal-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-credential-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-config-data\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537857 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-combined-ca-bundle\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-public-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-scripts\") pod 
\"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.537989 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flrft\" (UniqueName: \"kubernetes.io/projected/7b20a034-11f6-40ad-9447-32c49f705c07-kube-api-access-flrft\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.541801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-internal-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.542143 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-scripts\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.542494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-public-tls-certs\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.543520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-fernet-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" 
Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.544542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-credential-keys\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.545773 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-config-data\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.555648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b20a034-11f6-40ad-9447-32c49f705c07-combined-ca-bundle\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.567345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flrft\" (UniqueName: \"kubernetes.io/projected/7b20a034-11f6-40ad-9447-32c49f705c07-kube-api-access-flrft\") pod \"keystone-85b996ff68-fdzxg\" (UID: \"7b20a034-11f6-40ad-9447-32c49f705c07\") " pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:27 crc kubenswrapper[4795]: I0320 17:38:27.672878 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-85b996ff68-fdzxg" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.525329 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.525394 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.560555 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.574247 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.600716 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.600900 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.778036 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.778404 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.779270 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.806140 4795 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nfr5n" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.807890 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-7flct" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.836808 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860703 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxdsg\" (UniqueName: \"kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg\") pod \"78238b29-6bdd-4f77-847e-731c6c785ed9\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860773 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts\") pod \"78238b29-6bdd-4f77-847e-731c6c785ed9\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs\") pod \"78238b29-6bdd-4f77-847e-731c6c785ed9\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860928 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff42s\" (UniqueName: \"kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s\") pod \"e83d2a1a-2b3b-409a-997a-672e322b1d8e\" (UID: \"e83d2a1a-2b3b-409a-997a-672e322b1d8e\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860967 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle\") pod \"78238b29-6bdd-4f77-847e-731c6c785ed9\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.860980 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data\") pod \"78238b29-6bdd-4f77-847e-731c6c785ed9\" (UID: \"78238b29-6bdd-4f77-847e-731c6c785ed9\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.868067 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs" (OuterVolumeSpecName: "logs") pod "78238b29-6bdd-4f77-847e-731c6c785ed9" (UID: "78238b29-6bdd-4f77-847e-731c6c785ed9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.908661 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg" (OuterVolumeSpecName: "kube-api-access-dxdsg") pod "78238b29-6bdd-4f77-847e-731c6c785ed9" (UID: "78238b29-6bdd-4f77-847e-731c6c785ed9"). InnerVolumeSpecName "kube-api-access-dxdsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.908719 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s" (OuterVolumeSpecName: "kube-api-access-ff42s") pod "e83d2a1a-2b3b-409a-997a-672e322b1d8e" (UID: "e83d2a1a-2b3b-409a-997a-672e322b1d8e"). InnerVolumeSpecName "kube-api-access-ff42s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.908749 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.909039 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7fb74ddb8-dbrvh" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.921910 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7fb74ddb8-dbrvh" podUID="f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.922676 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78238b29-6bdd-4f77-847e-731c6c785ed9" (UID: "78238b29-6bdd-4f77-847e-731c6c785ed9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.926090 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data" (OuterVolumeSpecName: "config-data") pod "78238b29-6bdd-4f77-847e-731c6c785ed9" (UID: "78238b29-6bdd-4f77-847e-731c6c785ed9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.927905 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts" (OuterVolumeSpecName: "scripts") pod "78238b29-6bdd-4f77-847e-731c6c785ed9" (UID: "78238b29-6bdd-4f77-847e-731c6c785ed9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.962409 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config\") pod \"37537245-d57e-4087-ade6-6c028eb4d137\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.962503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-868zx\" (UniqueName: \"kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx\") pod \"37537245-d57e-4087-ade6-6c028eb4d137\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.962541 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle\") pod \"37537245-d57e-4087-ade6-6c028eb4d137\" (UID: \"37537245-d57e-4087-ade6-6c028eb4d137\") " Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963028 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78238b29-6bdd-4f77-847e-731c6c785ed9-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963054 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff42s\" (UniqueName: 
\"kubernetes.io/projected/e83d2a1a-2b3b-409a-997a-672e322b1d8e-kube-api-access-ff42s\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963070 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963082 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963094 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxdsg\" (UniqueName: \"kubernetes.io/projected/78238b29-6bdd-4f77-847e-731c6c785ed9-kube-api-access-dxdsg\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.963105 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78238b29-6bdd-4f77-847e-731c6c785ed9-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:28 crc kubenswrapper[4795]: I0320 17:38:28.966177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx" (OuterVolumeSpecName: "kube-api-access-868zx") pod "37537245-d57e-4087-ade6-6c028eb4d137" (UID: "37537245-d57e-4087-ade6-6c028eb4d137"). InnerVolumeSpecName "kube-api-access-868zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.005492 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config" (OuterVolumeSpecName: "config") pod "37537245-d57e-4087-ade6-6c028eb4d137" (UID: "37537245-d57e-4087-ade6-6c028eb4d137"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.015803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37537245-d57e-4087-ade6-6c028eb4d137" (UID: "37537245-d57e-4087-ade6-6c028eb4d137"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.064946 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.064974 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-868zx\" (UniqueName: \"kubernetes.io/projected/37537245-d57e-4087-ade6-6c028eb4d137-kube-api-access-868zx\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.064989 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37537245-d57e-4087-ade6-6c028eb4d137-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.275635 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85b996ff68-fdzxg"] Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.388405 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.619908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerStarted","Data":"845acac18cf9a7e90383c869af15e1e179f275559f11db3566d8f11150b78c3d"} Mar 20 17:38:29 crc 
kubenswrapper[4795]: I0320 17:38:29.639866 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85b996ff68-fdzxg" event={"ID":"7b20a034-11f6-40ad-9447-32c49f705c07","Type":"ContainerStarted","Data":"17fb2772904e9cb1a712ecf983534dfaadc0d44555105dfe0904f9c295f5ae49"} Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.647620 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nfr5n" event={"ID":"78238b29-6bdd-4f77-847e-731c6c785ed9","Type":"ContainerDied","Data":"88a328785f37d83e7d6391b28b27d2d2c6fdbb3c3985b1505819228d225ea6fa"} Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.647656 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88a328785f37d83e7d6391b28b27d2d2c6fdbb3c3985b1505819228d225ea6fa" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.647732 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nfr5n" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.649746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7mx5b" event={"ID":"37537245-d57e-4087-ade6-6c028eb4d137","Type":"ContainerDied","Data":"c771cb8aaa4f06cd374656dadf93993f0cd60baafd23685f3873cdc0a25a81b2"} Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.649768 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c771cb8aaa4f06cd374656dadf93993f0cd60baafd23685f3873cdc0a25a81b2" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.650597 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7mx5b" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.660754 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-7flct" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.661191 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-7flct" event={"ID":"e83d2a1a-2b3b-409a-997a-672e322b1d8e","Type":"ContainerDied","Data":"d0709eba067851036b3ccf0f38eb78dfb3069d88982f3378db1881ed2de27d68"} Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.661243 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0709eba067851036b3ccf0f38eb78dfb3069d88982f3378db1881ed2de27d68" Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.663010 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerStarted","Data":"79719142974a75aa1ceb9ca03ec61b98a42d47f6e27982f5c5a5e0502981ad81"} Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.936743 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-b9gh7"] Mar 20 17:38:29 crc kubenswrapper[4795]: I0320 17:38:29.961239 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-b9gh7"] Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.061373 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fc784f9bb-wjct6"] Mar 20 17:38:30 crc kubenswrapper[4795]: E0320 17:38:30.061747 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78238b29-6bdd-4f77-847e-731c6c785ed9" containerName="placement-db-sync" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.061767 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="78238b29-6bdd-4f77-847e-731c6c785ed9" containerName="placement-db-sync" Mar 20 17:38:30 crc kubenswrapper[4795]: E0320 17:38:30.061785 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e83d2a1a-2b3b-409a-997a-672e322b1d8e" containerName="oc" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.061793 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83d2a1a-2b3b-409a-997a-672e322b1d8e" containerName="oc" Mar 20 17:38:30 crc kubenswrapper[4795]: E0320 17:38:30.061820 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37537245-d57e-4087-ade6-6c028eb4d137" containerName="neutron-db-sync" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.061829 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="37537245-d57e-4087-ade6-6c028eb4d137" containerName="neutron-db-sync" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.062091 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="37537245-d57e-4087-ade6-6c028eb4d137" containerName="neutron-db-sync" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.062111 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83d2a1a-2b3b-409a-997a-672e322b1d8e" containerName="oc" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.062130 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="78238b29-6bdd-4f77-847e-731c6c785ed9" containerName="placement-db-sync" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.083561 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.103543 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.104286 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qdq8q" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.104431 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.104561 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.108714 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.110580 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc784f9bb-wjct6"] Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.199987 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48841a5b-142c-49d0-8e87-8562f8d1f824-logs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200032 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9765z\" (UniqueName: \"kubernetes.io/projected/48841a5b-142c-49d0-8e87-8562f8d1f824-kube-api-access-9765z\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200058 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-internal-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200084 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-config-data\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200109 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-public-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-combined-ca-bundle\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.200164 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-scripts\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302334 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48841a5b-142c-49d0-8e87-8562f8d1f824-logs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9765z\" (UniqueName: \"kubernetes.io/projected/48841a5b-142c-49d0-8e87-8562f8d1f824-kube-api-access-9765z\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302415 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-internal-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302438 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-config-data\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-public-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-combined-ca-bundle\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.302512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-scripts\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.308306 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48841a5b-142c-49d0-8e87-8562f8d1f824-logs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.308818 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-internal-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.309959 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-public-tls-certs\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.311826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-scripts\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") 
" pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.318207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-config-data\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.318249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48841a5b-142c-49d0-8e87-8562f8d1f824-combined-ca-bundle\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.349769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9765z\" (UniqueName: \"kubernetes.io/projected/48841a5b-142c-49d0-8e87-8562f8d1f824-kube-api-access-9765z\") pod \"placement-fc784f9bb-wjct6\" (UID: \"48841a5b-142c-49d0-8e87-8562f8d1f824\") " pod="openstack/placement-fc784f9bb-wjct6" Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.365745 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"] Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.367119 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.406382 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"]
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.408946 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.408980 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.409051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6csjq\" (UniqueName: \"kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.409071 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.409102 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.409120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.426065 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.510688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.510951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.511015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6csjq\" (UniqueName: \"kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.511035 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.511066 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.511083 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.511759 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.512049 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.512463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.512582 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.512789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.533990 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6csjq\" (UniqueName: \"kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq\") pod \"dnsmasq-dns-55f844cf75-t6ztr\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") " pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.540310 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-575df674dd-5xp2t"]
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.541638 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.544899 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.545171 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.545679 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.545815 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qbkvb"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.552218 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-575df674dd-5xp2t"]
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.612636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.612709 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2x7h\" (UniqueName: \"kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.612749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.612791 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.612828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.692180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4t68k" event={"ID":"d254abd5-b344-416a-b99d-96737388795e","Type":"ContainerStarted","Data":"20458e6912d6d217fb3ae1b5fc987499c631ca087920807a0e981310469342cf"}
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.694840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerStarted","Data":"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4"}
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.695905 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.695920 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.696826 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85b996ff68-fdzxg" event={"ID":"7b20a034-11f6-40ad-9447-32c49f705c07","Type":"ContainerStarted","Data":"01fce90f9912505f0c63c2440dbfb5a9d7dbc1b947372ef3c1ec663fec74a640"}
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.696846 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-85b996ff68-fdzxg"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721091 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721161 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2x7h\" (UniqueName: \"kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721310 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.721762 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.722481 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-4t68k" podStartSLOduration=2.6786024790000003 podStartE2EDuration="40.722458209s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="2026-03-20 17:37:51.741583902 +0000 UTC m=+1215.199615443" lastFinishedPulling="2026-03-20 17:38:29.785439632 +0000 UTC m=+1253.243471173" observedRunningTime="2026-03-20 17:38:30.707631183 +0000 UTC m=+1254.165662734" watchObservedRunningTime="2026-03-20 17:38:30.722458209 +0000 UTC m=+1254.180489750"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.731338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.737291 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.739600 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.750369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2x7h\" (UniqueName: \"kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.756011 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-85b996ff68-fdzxg" podStartSLOduration=3.7559947129999998 podStartE2EDuration="3.755994713s" podCreationTimestamp="2026-03-20 17:38:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:30.745149032 +0000 UTC m=+1254.203180573" watchObservedRunningTime="2026-03-20 17:38:30.755994713 +0000 UTC m=+1254.214026254"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.839680 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config\") pod \"neutron-575df674dd-5xp2t\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") " pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:30 crc kubenswrapper[4795]: I0320 17:38:30.899073 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.223173 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"]
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.270665 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd" path="/var/lib/kubelet/pods/9adebcfa-43f1-4af6-9a8a-a7b9fd730bdd/volumes"
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.328459 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fc784f9bb-wjct6"]
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.458934 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.462115 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.670340 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-575df674dd-5xp2t"]
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.716102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc784f9bb-wjct6" event={"ID":"48841a5b-142c-49d0-8e87-8562f8d1f824","Type":"ContainerStarted","Data":"d07d6dd3cc2db4cdd1777a7fc98fdc1ca644aa4e69a46449d4895e63be3fa501"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.718939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc784f9bb-wjct6" event={"ID":"48841a5b-142c-49d0-8e87-8562f8d1f824","Type":"ContainerStarted","Data":"faeae9f21a956a7e66e45503fb180190dd7762567a0ecca500272e336de66482"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.736351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerStarted","Data":"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.809477 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.8094548580000005 podStartE2EDuration="7.809454858s" podCreationTimestamp="2026-03-20 17:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:31.771392092 +0000 UTC m=+1255.229423633" watchObservedRunningTime="2026-03-20 17:38:31.809454858 +0000 UTC m=+1255.267486399"
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.815331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rdxps" event={"ID":"706c47a0-7763-44af-9b14-0e5322a8f2f1","Type":"ContainerStarted","Data":"b272744c883e204ac3c7a8e8c3e62d9d484f7c330b2609300488f12a64494d78"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.834090 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerID="4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163" exitCode=0
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.836533 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" event={"ID":"7a77884f-5f74-473c-9875-d7afc62ab2f5","Type":"ContainerDied","Data":"4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.836573 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" event={"ID":"7a77884f-5f74-473c-9875-d7afc62ab2f5","Type":"ContainerStarted","Data":"af377878f795441c98f680067ea533216f89a8059101023b972799ab26727a8a"}
Mar 20 17:38:31 crc kubenswrapper[4795]: I0320 17:38:31.872213 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-rdxps" podStartSLOduration=3.5542913 podStartE2EDuration="41.872193969s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="2026-03-20 17:37:51.741348915 +0000 UTC m=+1215.199380456" lastFinishedPulling="2026-03-20 17:38:30.059251584 +0000 UTC m=+1253.517283125" observedRunningTime="2026-03-20 17:38:31.863678031 +0000 UTC m=+1255.321709592" watchObservedRunningTime="2026-03-20 17:38:31.872193969 +0000 UTC m=+1255.330225510"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.863794 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" event={"ID":"7a77884f-5f74-473c-9875-d7afc62ab2f5","Type":"ContainerStarted","Data":"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773"}
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.864201 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.876461 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fc784f9bb-wjct6" event={"ID":"48841a5b-142c-49d0-8e87-8562f8d1f824","Type":"ContainerStarted","Data":"35559ff3a8ad2661db60328fffdb89879fc735d0d8b68e088b90d8af8feccac4"}
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.878515 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.878725 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.897278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerStarted","Data":"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476"}
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.897333 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerStarted","Data":"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a"}
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.897345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerStarted","Data":"6b6c595fe74467a83b78631dcaec9938772f82ba49da938ae37e739e51dd0a38"}
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.898228 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" podStartSLOduration=2.8982111120000003 podStartE2EDuration="2.898211112s" podCreationTimestamp="2026-03-20 17:38:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:32.890947064 +0000 UTC m=+1256.348978605" watchObservedRunningTime="2026-03-20 17:38:32.898211112 +0000 UTC m=+1256.356242653"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.898291 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-649db44647-mrjns"]
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.903267 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.907187 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.907266 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.945908 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-649db44647-mrjns"]
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.947296 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-fc784f9bb-wjct6" podStartSLOduration=2.947280483 podStartE2EDuration="2.947280483s" podCreationTimestamp="2026-03-20 17:38:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:32.918381175 +0000 UTC m=+1256.376412716" watchObservedRunningTime="2026-03-20 17:38:32.947280483 +0000 UTC m=+1256.405312024"
Mar 20 17:38:32 crc kubenswrapper[4795]: I0320 17:38:32.974011 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-575df674dd-5xp2t" podStartSLOduration=2.973991533 podStartE2EDuration="2.973991533s" podCreationTimestamp="2026-03-20 17:38:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:32.961373166 +0000 UTC m=+1256.419404937" watchObservedRunningTime="2026-03-20 17:38:32.973991533 +0000 UTC m=+1256.432023074"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082672 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-public-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082745 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-config\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-internal-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-ovndb-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082921 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjkjf\" (UniqueName: \"kubernetes.io/projected/5a472785-4467-4c97-93b9-e6f6eff19126-kube-api-access-jjkjf\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082954 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-httpd-config\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.082985 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-combined-ca-bundle\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-internal-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-ovndb-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjkjf\" (UniqueName: \"kubernetes.io/projected/5a472785-4467-4c97-93b9-e6f6eff19126-kube-api-access-jjkjf\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-httpd-config\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-combined-ca-bundle\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-public-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.184932 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-config\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.197672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-combined-ca-bundle\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.199966 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-internal-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.200377 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-ovndb-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.200701 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-config\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.201641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-public-tls-certs\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.205828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a472785-4467-4c97-93b9-e6f6eff19126-httpd-config\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.208248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjkjf\" (UniqueName: \"kubernetes.io/projected/5a472785-4467-4c97-93b9-e6f6eff19126-kube-api-access-jjkjf\") pod \"neutron-649db44647-mrjns\" (UID: \"5a472785-4467-4c97-93b9-e6f6eff19126\") " pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.233564 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.804558 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-649db44647-mrjns"]
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.921972 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-649db44647-mrjns" event={"ID":"5a472785-4467-4c97-93b9-e6f6eff19126","Type":"ContainerStarted","Data":"f17f708ce8f6f9465a4b59b4af8b0853d0f9e2263ad31d5a00163b96eb3c8597"}
Mar 20 17:38:33 crc kubenswrapper[4795]: I0320 17:38:33.922524 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:38:34 crc kubenswrapper[4795]: I0320 17:38:34.935945 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-649db44647-mrjns" event={"ID":"5a472785-4467-4c97-93b9-e6f6eff19126","Type":"ContainerStarted","Data":"77223825f1ef15c8abf7e03b971e41ccc6a4baeed09d05bd269cf50328a9cf4c"}
Mar 20 17:38:34 crc kubenswrapper[4795]: I0320 17:38:34.936313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-649db44647-mrjns" event={"ID":"5a472785-4467-4c97-93b9-e6f6eff19126","Type":"ContainerStarted","Data":"4326e2335a5d3a14f50bdd5288042c5e10959366a9d9580c26859376549fdce0"}
Mar 20 17:38:34 crc kubenswrapper[4795]: I0320 17:38:34.961655 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-649db44647-mrjns" podStartSLOduration=2.961636737 podStartE2EDuration="2.961636737s" podCreationTimestamp="2026-03-20 17:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:34.953013646 +0000 UTC m=+1258.411045207" watchObservedRunningTime="2026-03-20 17:38:34.961636737 +0000 UTC m=+1258.419668278"
Mar 20 17:38:34 crc kubenswrapper[4795]: I0320 17:38:34.979393 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 17:38:34 crc kubenswrapper[4795]: I0320 17:38:34.979440 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.016175 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.026451 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.948304 4795 generic.go:334] "Generic (PLEG): container finished" podID="d254abd5-b344-416a-b99d-96737388795e" containerID="20458e6912d6d217fb3ae1b5fc987499c631ca087920807a0e981310469342cf" exitCode=0
Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.948416 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4t68k" event={"ID":"d254abd5-b344-416a-b99d-96737388795e","Type":"ContainerDied","Data":"20458e6912d6d217fb3ae1b5fc987499c631ca087920807a0e981310469342cf"}
Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.950028 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.950079 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-649db44647-mrjns"
Mar 20 17:38:35 crc kubenswrapper[4795]: I0320 17:38:35.950093 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 17:38:37 crc kubenswrapper[4795]: I0320 17:38:37.755586 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 20 17:38:37 crc kubenswrapper[4795]: I0320 17:38:37.759328 4795 kubelet.go:2542]
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:38:38 crc kubenswrapper[4795]: I0320 17:38:38.777723 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 17:38:38 crc kubenswrapper[4795]: I0320 17:38:38.903003 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7fb74ddb8-dbrvh" podUID="f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 20 17:38:38 crc kubenswrapper[4795]: I0320 17:38:38.987528 4795 generic.go:334] "Generic (PLEG): container finished" podID="706c47a0-7763-44af-9b14-0e5322a8f2f1" containerID="b272744c883e204ac3c7a8e8c3e62d9d484f7c330b2609300488f12a64494d78" exitCode=0 Mar 20 17:38:38 crc kubenswrapper[4795]: I0320 17:38:38.987618 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rdxps" event={"ID":"706c47a0-7763-44af-9b14-0e5322a8f2f1","Type":"ContainerDied","Data":"b272744c883e204ac3c7a8e8c3e62d9d484f7c330b2609300488f12a64494d78"} Mar 20 17:38:40 crc kubenswrapper[4795]: I0320 17:38:40.726836 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" Mar 20 17:38:40 crc kubenswrapper[4795]: I0320 17:38:40.878345 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"] Mar 20 17:38:40 crc kubenswrapper[4795]: I0320 17:38:40.879035 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" 
podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerName="dnsmasq-dns" containerID="cri-o://fb25591a64e281622f2e5d8c32301267612a9141ea5e32d635d846faf3ba4c18" gracePeriod=10 Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.300463 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.300529 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.300578 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.301369 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.301431 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd" gracePeriod=600 Mar 20 17:38:41 crc kubenswrapper[4795]: 
I0320 17:38:41.808567 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4t68k" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.819282 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rdxps" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940642 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940753 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940813 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle\") pod \"d254abd5-b344-416a-b99d-96737388795e\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data\") pod \"d254abd5-b344-416a-b99d-96737388795e\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.940905 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.941012 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.941056 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.941083 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4285l\" (UniqueName: \"kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l\") pod \"706c47a0-7763-44af-9b14-0e5322a8f2f1\" (UID: \"706c47a0-7763-44af-9b14-0e5322a8f2f1\") " Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.941152 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-692nl\" (UniqueName: \"kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl\") pod \"d254abd5-b344-416a-b99d-96737388795e\" (UID: \"d254abd5-b344-416a-b99d-96737388795e\") " Mar 20 17:38:41 
crc kubenswrapper[4795]: I0320 17:38:41.942218 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/706c47a0-7763-44af-9b14-0e5322a8f2f1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.947010 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d254abd5-b344-416a-b99d-96737388795e" (UID: "d254abd5-b344-416a-b99d-96737388795e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.947044 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts" (OuterVolumeSpecName: "scripts") pod "706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.952093 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl" (OuterVolumeSpecName: "kube-api-access-692nl") pod "d254abd5-b344-416a-b99d-96737388795e" (UID: "d254abd5-b344-416a-b99d-96737388795e"). InnerVolumeSpecName "kube-api-access-692nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.952164 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.952412 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l" (OuterVolumeSpecName: "kube-api-access-4285l") pod "706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "kube-api-access-4285l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.968645 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d254abd5-b344-416a-b99d-96737388795e" (UID: "d254abd5-b344-416a-b99d-96737388795e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:41 crc kubenswrapper[4795]: I0320 17:38:41.969675 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.014971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data" (OuterVolumeSpecName: "config-data") pod "706c47a0-7763-44af-9b14-0e5322a8f2f1" (UID: "706c47a0-7763-44af-9b14-0e5322a8f2f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.015597 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4t68k" event={"ID":"d254abd5-b344-416a-b99d-96737388795e","Type":"ContainerDied","Data":"eaa9eee2e882516c5d4ae5df7684d52bf42c7eec92e061674b1b8ad393538f60"} Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.015646 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaa9eee2e882516c5d4ae5df7684d52bf42c7eec92e061674b1b8ad393538f60" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.015612 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4t68k" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.017995 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd" exitCode=0 Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.018070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd"} Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.018114 4795 scope.go:117] "RemoveContainer" containerID="f73d86cbf6e4e022d7d0d974b9432014c717a77e5dd4cab7c0869bd72004efbc" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.023499 4795 generic.go:334] "Generic (PLEG): container finished" podID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerID="fb25591a64e281622f2e5d8c32301267612a9141ea5e32d635d846faf3ba4c18" exitCode=0 Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.023575 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" 
event={"ID":"f79c1ee6-f8b4-485c-ac9e-667a09868206","Type":"ContainerDied","Data":"fb25591a64e281622f2e5d8c32301267612a9141ea5e32d635d846faf3ba4c18"} Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.026439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rdxps" event={"ID":"706c47a0-7763-44af-9b14-0e5322a8f2f1","Type":"ContainerDied","Data":"d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5"} Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.026472 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d61709fa41e66a085fdbbea917bc1e49782aa85cb38708012e5ebcca3093a1b5" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.026521 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rdxps" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044657 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044721 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4285l\" (UniqueName: \"kubernetes.io/projected/706c47a0-7763-44af-9b14-0e5322a8f2f1-kube-api-access-4285l\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044732 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-692nl\" (UniqueName: \"kubernetes.io/projected/d254abd5-b344-416a-b99d-96737388795e-kube-api-access-692nl\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044741 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 
17:38:42.044757 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044771 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d254abd5-b344-416a-b99d-96737388795e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044785 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.044797 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/706c47a0-7763-44af-9b14-0e5322a8f2f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.317177 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.451849 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j89f\" (UniqueName: \"kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.452229 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.452290 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.452326 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.452394 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.452469 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") pod \"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.456195 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f" (OuterVolumeSpecName: "kube-api-access-9j89f") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "kube-api-access-9j89f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.506942 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.510259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config" (OuterVolumeSpecName: "config") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.512372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:42 crc kubenswrapper[4795]: E0320 17:38:42.531305 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0 podName:f79c1ee6-f8b4-485c-ac9e-667a09868206 nodeName:}" failed. No retries permitted until 2026-03-20 17:38:43.031280124 +0000 UTC m=+1266.489311665 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206") : error deleting /var/lib/kubelet/pods/f79c1ee6-f8b4-485c-ac9e-667a09868206/volume-subpaths: remove /var/lib/kubelet/pods/f79c1ee6-f8b4-485c-ac9e-667a09868206/volume-subpaths: no such file or directory Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.531558 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.555666 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.555716 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j89f\" (UniqueName: \"kubernetes.io/projected/f79c1ee6-f8b4-485c-ac9e-667a09868206-kube-api-access-9j89f\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.555730 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.555741 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:42 crc kubenswrapper[4795]: I0320 17:38:42.555750 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.048744 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1"} Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.063743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") pod 
\"f79c1ee6-f8b4-485c-ac9e-667a09868206\" (UID: \"f79c1ee6-f8b4-485c-ac9e-667a09868206\") " Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.064856 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f79c1ee6-f8b4-485c-ac9e-667a09868206" (UID: "f79c1ee6-f8b4-485c-ac9e-667a09868206"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.081079 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" event={"ID":"f79c1ee6-f8b4-485c-ac9e-667a09868206","Type":"ContainerDied","Data":"f8b83339bc587b6ce7e9840f042549b74db3594a545a74800d72e3d558d164b1"} Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.081135 4795 scope.go:117] "RemoveContainer" containerID="fb25591a64e281622f2e5d8c32301267612a9141ea5e32d635d846faf3ba4c18" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.081258 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dlkxl" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.125534 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-76977cb5bb-84w8l"] Mar 20 17:38:43 crc kubenswrapper[4795]: E0320 17:38:43.125879 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerName="dnsmasq-dns" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.125891 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerName="dnsmasq-dns" Mar 20 17:38:43 crc kubenswrapper[4795]: E0320 17:38:43.125905 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerName="init" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.125912 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" containerName="init" Mar 20 17:38:43 crc kubenswrapper[4795]: E0320 17:38:43.125938 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1" containerName="cinder-db-sync" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.125944 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1" containerName="cinder-db-sync" Mar 20 17:38:43 crc kubenswrapper[4795]: E0320 17:38:43.125955 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d254abd5-b344-416a-b99d-96737388795e" containerName="barbican-db-sync" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.125960 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d254abd5-b344-416a-b99d-96737388795e" containerName="barbican-db-sync" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.126129 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" 
containerName="dnsmasq-dns" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.126141 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d254abd5-b344-416a-b99d-96737388795e" containerName="barbican-db-sync" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.126159 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1" containerName="cinder-db-sync" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.133310 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.140196 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qhmpx" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.140519 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.141504 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.149967 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerStarted","Data":"7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220"} Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.150168 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-central-agent" containerID="cri-o://3b6b098ddb9cfeee495acedec7b7145d7d7ba5c2f18ba21ad1f2ac7b8c96c1b3" gracePeriod=30 Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.150316 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:38:43 crc 
kubenswrapper[4795]: I0320 17:38:43.150354 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="proxy-httpd" containerID="cri-o://7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220" gracePeriod=30 Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.150390 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="sg-core" containerID="cri-o://845acac18cf9a7e90383c869af15e1e179f275559f11db3566d8f11150b78c3d" gracePeriod=30 Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.150420 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-notification-agent" containerID="cri-o://5d540cc1bf447dcf67570fd849a590667f72184381546b2b04f9eddfb973cf69" gracePeriod=30 Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.170093 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-558cc4f6c9-d6wp7"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.171409 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.176796 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.193366 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f79c1ee6-f8b4-485c-ac9e-667a09868206-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.194362 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76977cb5bb-84w8l"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.207732 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.252039 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-558cc4f6c9-d6wp7"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.288987 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.648669325 podStartE2EDuration="53.288966437s" podCreationTimestamp="2026-03-20 17:37:50 +0000 UTC" firstStartedPulling="2026-03-20 17:37:51.790881467 +0000 UTC m=+1215.248913008" lastFinishedPulling="2026-03-20 17:38:42.431178569 +0000 UTC m=+1265.889210120" observedRunningTime="2026-03-20 17:38:43.183670199 +0000 UTC m=+1266.641701730" watchObservedRunningTime="2026-03-20 17:38:43.288966437 +0000 UTC m=+1266.746997978" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-logs\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " 
pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data-custom\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa8c15c-b759-4db8-ac4d-28648a8cfde2-logs\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data-custom\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-combined-ca-bundle\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294503 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294544 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-combined-ca-bundle\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294590 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4rfn\" (UniqueName: \"kubernetes.io/projected/faa8c15c-b759-4db8-ac4d-28648a8cfde2-kube-api-access-j4rfn\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.294609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sxmm\" (UniqueName: \"kubernetes.io/projected/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-kube-api-access-6sxmm\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 
17:38:43.335234 4795 scope.go:117] "RemoveContainer" containerID="65673e010192e9af6a054b2e6fafb5d1f1505b377d27e64bdbfd06c2c8d1a1c2" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.340275 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dlkxl"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406748 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-logs\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data-custom\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406873 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa8c15c-b759-4db8-ac4d-28648a8cfde2-logs\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406912 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data-custom\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406938 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-combined-ca-bundle\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.406981 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.407013 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.407038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-combined-ca-bundle\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.407121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4rfn\" (UniqueName: \"kubernetes.io/projected/faa8c15c-b759-4db8-ac4d-28648a8cfde2-kube-api-access-j4rfn\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc 
kubenswrapper[4795]: I0320 17:38:43.407147 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sxmm\" (UniqueName: \"kubernetes.io/projected/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-kube-api-access-6sxmm\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.407834 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-logs\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.410431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa8c15c-b759-4db8-ac4d-28648a8cfde2-logs\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.426036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-combined-ca-bundle\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.426309 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 
17:38:43.426618 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-combined-ca-bundle\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.435772 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dsxmf"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.437346 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.439345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faa8c15c-b759-4db8-ac4d-28648a8cfde2-config-data-custom\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.439884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data-custom\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.443626 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4rfn\" (UniqueName: \"kubernetes.io/projected/faa8c15c-b759-4db8-ac4d-28648a8cfde2-kube-api-access-j4rfn\") pod \"barbican-keystone-listener-76977cb5bb-84w8l\" (UID: \"faa8c15c-b759-4db8-ac4d-28648a8cfde2\") " pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 
17:38:43.455324 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.457556 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.459159 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-config-data\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.460989 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sxmm\" (UniqueName: \"kubernetes.io/projected/aa9d179b-7e78-4a37-80aa-3f3f6e7cabea-kube-api-access-6sxmm\") pod \"barbican-worker-558cc4f6c9-d6wp7\" (UID: \"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea\") " pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.464989 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.468829 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.468932 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.468840 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m5c4m" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.478028 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dsxmf"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.483301 4795 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.505394 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.515783 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.515901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.515943 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516015 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4tvg\" (UniqueName: \"kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516032 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516181 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6gqg\" (UniqueName: \"kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516213 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516253 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516286 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.516303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.528548 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-558cc4f6c9-d6wp7" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.530830 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.532217 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.534506 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.554725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.569143 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dsxmf"] Mar 20 17:38:43 crc kubenswrapper[4795]: E0320 17:38:43.569877 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-g4tvg ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" podUID="b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.584699 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.586662 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.617800 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"] Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618667 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618768 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618897 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b6gqg\" (UniqueName: \"kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618917 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618961 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.618987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619071 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619153 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619213 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.619233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.620184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.620612 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.620670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92c6w\" (UniqueName: \"kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.620792 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.621189 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.621477 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xvp5\" (UniqueName: \"kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.621510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4tvg\" (UniqueName: \"kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.621529 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.621553 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.626017 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.626303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.626772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.627019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.627398 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.637783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.638389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.638697 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.639280 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.640664 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.644048 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.644218 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.649532 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4tvg\" (UniqueName: \"kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg\") pod \"dnsmasq-dns-85ff748b95-dsxmf\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") " pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.654354 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6gqg\" (UniqueName: \"kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg\") pod \"cinder-scheduler-0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.726893 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727344 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92c6w\" (UniqueName: \"kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727500 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xvp5\" (UniqueName: \"kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727783 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727853 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727921 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.727999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728071 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728157 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzbj\" (UniqueName: \"kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728519 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.728591 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.729524 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.732877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.733140 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.733672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.734068 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.734620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.735128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.741896 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.753324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.756153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92c6w\" (UniqueName: \"kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w\") pod \"dnsmasq-dns-5c9776ccc5-s4k94\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.759875 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xvp5\" (UniqueName: \"kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5\") pod \"barbican-api-6df8664bf8-htftz\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830338 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830452 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830483 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xzbj\" (UniqueName: \"kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830594 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.830752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.831557 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.831793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.833496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.834943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.839204 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.850329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xzbj\" (UniqueName: \"kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.859772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data\") pod \"cinder-api-0\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " pod="openstack/cinder-api-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.862327 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.885776 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:43 crc kubenswrapper[4795]: I0320 17:38:43.941137 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.010535 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.082302 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76977cb5bb-84w8l"]
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.108169 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.184677 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-558cc4f6c9-d6wp7"]
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.185992 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerID="7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220" exitCode=0
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.186024 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerID="845acac18cf9a7e90383c869af15e1e179f275559f11db3566d8f11150b78c3d" exitCode=2
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.186044 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerID="3b6b098ddb9cfeee495acedec7b7145d7d7ba5c2f18ba21ad1f2ac7b8c96c1b3" exitCode=0
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.186100 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerDied","Data":"7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220"}
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.186131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerDied","Data":"845acac18cf9a7e90383c869af15e1e179f275559f11db3566d8f11150b78c3d"}
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.186146 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerDied","Data":"3b6b098ddb9cfeee495acedec7b7145d7d7ba5c2f18ba21ad1f2ac7b8c96c1b3"}
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.188478 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.188953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" event={"ID":"faa8c15c-b759-4db8-ac4d-28648a8cfde2","Type":"ContainerStarted","Data":"8f3d7a20354df7d18fc487eec759baf1acbab93bbcbc536df420db590d468521"}
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.203993 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236407 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") "
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") "
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236542 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4tvg\" (UniqueName: \"kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") "
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236569 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") "
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236597 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") "
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.236648 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb\") pod \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\" (UID: \"b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d\") "
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.237492 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.238029 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config" (OuterVolumeSpecName: "config") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.238333 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.239343 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.240224 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.242586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg" (OuterVolumeSpecName: "kube-api-access-g4tvg") pod "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" (UID: "b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d"). InnerVolumeSpecName "kube-api-access-g4tvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338183 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338210 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338219 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4tvg\" (UniqueName: \"kubernetes.io/projected/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-kube-api-access-g4tvg\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338228 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338236 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.338243 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.352889 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.360948 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"]
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.514523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"]
Mar 20 17:38:44 crc kubenswrapper[4795]: I0320 17:38:44.600730 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.202562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerStarted","Data":"c8191878b7b3ac40636daf5acafd14c09be03b0e522e52658ab845e1056e98aa"}
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.205173 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerStarted","Data":"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17"}
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.205218 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerStarted","Data":"0867819d1c6037d3f5c50bda529eb1c4bf953092deb0683fb9110c469d1d3a7a"}
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.207088 4795 generic.go:334] "Generic (PLEG): container finished" podID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerID="7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a" exitCode=0
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.207207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" event={"ID":"da0f84b3-294d-455f-89e7-1c8f8439a837","Type":"ContainerDied","Data":"7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a"}
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.207232 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" event={"ID":"da0f84b3-294d-455f-89e7-1c8f8439a837","Type":"ContainerStarted","Data":"d7ccdefaaa93b0e48b444bfb331ca4591ab4806568e7a9f1ee5df6eaa4ff29c6"}
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.210111 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerStarted","Data":"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805"}
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.210162 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerStarted","Data":"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16"}
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.210179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerStarted","Data":"57c443aca3511d65cb1758f7e520aeb9b66168af6d8e810d947b3d385977aa2d"}
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.210417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.210970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.214206 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-dsxmf"
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.214460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-558cc4f6c9-d6wp7" event={"ID":"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea","Type":"ContainerStarted","Data":"aa03efdcdb80e6512a64887c9fc6a68c832266c81fee6bc6488136ea3d701040"}
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.283968 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6df8664bf8-htftz" podStartSLOduration=2.283946131 podStartE2EDuration="2.283946131s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:45.247631931 +0000 UTC m=+1268.705663472" watchObservedRunningTime="2026-03-20 17:38:45.283946131 +0000 UTC m=+1268.741977672"
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.297085 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79c1ee6-f8b4-485c-ac9e-667a09868206" path="/var/lib/kubelet/pods/f79c1ee6-f8b4-485c-ac9e-667a09868206/volumes"
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.346539 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.420233 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dsxmf"]
Mar 20 17:38:45 crc kubenswrapper[4795]: I0320 17:38:45.427911 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-dsxmf"]
Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.238781 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerID="5d540cc1bf447dcf67570fd849a590667f72184381546b2b04f9eddfb973cf69" exitCode=0
Mar 20 17:38:46 crc kubenswrapper[4795]: I0320
17:38:46.238854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerDied","Data":"5d540cc1bf447dcf67570fd849a590667f72184381546b2b04f9eddfb973cf69"} Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.241602 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api-log" containerID="cri-o://10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17" gracePeriod=30 Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.241647 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api" containerID="cri-o://a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c" gracePeriod=30 Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.241578 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerStarted","Data":"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c"} Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.241955 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.244895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" event={"ID":"da0f84b3-294d-455f-89e7-1c8f8439a837","Type":"ContainerStarted","Data":"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93"} Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.245418 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.268829 4795 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.268809841 podStartE2EDuration="3.268809841s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:46.257114284 +0000 UTC m=+1269.715145825" watchObservedRunningTime="2026-03-20 17:38:46.268809841 +0000 UTC m=+1269.726841382" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.281863 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" podStartSLOduration=3.281848591 podStartE2EDuration="3.281848591s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:46.279427255 +0000 UTC m=+1269.737458796" watchObservedRunningTime="2026-03-20 17:38:46.281848591 +0000 UTC m=+1269.739880132" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.395222 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484182 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484554 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484608 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484729 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484759 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.484777 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz44h\" (UniqueName: \"kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h\") pod \"5d348053-c6a6-462c-9e8d-5ff55140a554\" (UID: \"5d348053-c6a6-462c-9e8d-5ff55140a554\") " Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.485305 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.485669 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.492041 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h" (OuterVolumeSpecName: "kube-api-access-zz44h") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "kube-api-access-zz44h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.492160 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts" (OuterVolumeSpecName: "scripts") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.518320 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.582324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587456 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587502 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587516 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz44h\" (UniqueName: \"kubernetes.io/projected/5d348053-c6a6-462c-9e8d-5ff55140a554-kube-api-access-zz44h\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587525 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587533 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.587542 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d348053-c6a6-462c-9e8d-5ff55140a554-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.625637 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data" (OuterVolumeSpecName: "config-data") pod "5d348053-c6a6-462c-9e8d-5ff55140a554" (UID: "5d348053-c6a6-462c-9e8d-5ff55140a554"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:46 crc kubenswrapper[4795]: I0320 17:38:46.688607 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d348053-c6a6-462c-9e8d-5ff55140a554-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.127637 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197232 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197300 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197414 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197469 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197526 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xzbj\" (UniqueName: \"kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.197676 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data\") pod \"296aa0d1-17fa-44da-8868-1ebb0006c417\" (UID: \"296aa0d1-17fa-44da-8868-1ebb0006c417\") " Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.200646 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs" (OuterVolumeSpecName: "logs") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.200778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.206223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.206262 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts" (OuterVolumeSpecName: "scripts") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.206696 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj" (OuterVolumeSpecName: "kube-api-access-5xzbj") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "kube-api-access-5xzbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.284932 4795 generic.go:334] "Generic (PLEG): container finished" podID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerID="a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c" exitCode=0 Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.284967 4795 generic.go:334] "Generic (PLEG): container finished" podID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerID="10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17" exitCode=143 Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.285094 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.287663 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.298420 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d" path="/var/lib/kubelet/pods/b13fc8ca-8012-4cf1-bbe4-b83ff3eb3b8d/volumes" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299536 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299609 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/296aa0d1-17fa-44da-8868-1ebb0006c417-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299630 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/296aa0d1-17fa-44da-8868-1ebb0006c417-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299640 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299648 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299658 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.299666 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xzbj\" (UniqueName: \"kubernetes.io/projected/296aa0d1-17fa-44da-8868-1ebb0006c417-kube-api-access-5xzbj\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerDied","Data":"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 
17:38:47.307441 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerDied","Data":"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"296aa0d1-17fa-44da-8868-1ebb0006c417","Type":"ContainerDied","Data":"0867819d1c6037d3f5c50bda529eb1c4bf953092deb0683fb9110c469d1d3a7a"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307465 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" event={"ID":"faa8c15c-b759-4db8-ac4d-28648a8cfde2","Type":"ContainerStarted","Data":"56604dff2798710557b3f53431abf9d5766ec2e2e221efea7bc947c8bdd3f969"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-558cc4f6c9-d6wp7" event={"ID":"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea","Type":"ContainerStarted","Data":"8942e26425357dd66aec41f104906e52c4f17443af2003e51e6111a96df8eee3"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-558cc4f6c9-d6wp7" event={"ID":"aa9d179b-7e78-4a37-80aa-3f3f6e7cabea","Type":"ContainerStarted","Data":"ff561ffcc67744b158c99814015cb22e1d7cf4e90c3b0b0118d0d52bff65d9f0"} Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307474 4795 scope.go:117] "RemoveContainer" containerID="a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.307523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d348053-c6a6-462c-9e8d-5ff55140a554","Type":"ContainerDied","Data":"d2d85da431f6c738cb4aa9ee890ff4d70deedc277e0a8410951bda0e019d69a8"} Mar 20 
17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.342093 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data" (OuterVolumeSpecName: "config-data") pod "296aa0d1-17fa-44da-8868-1ebb0006c417" (UID: "296aa0d1-17fa-44da-8868-1ebb0006c417"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.369831 4795 scope.go:117] "RemoveContainer" containerID="10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.378665 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.389996 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.403727 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/296aa0d1-17fa-44da-8868-1ebb0006c417-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.409313 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-558cc4f6c9-d6wp7" podStartSLOduration=2.5117347370000003 podStartE2EDuration="4.40929592s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="2026-03-20 17:38:44.196752416 +0000 UTC m=+1267.654783957" lastFinishedPulling="2026-03-20 17:38:46.094313599 +0000 UTC m=+1269.552345140" observedRunningTime="2026-03-20 17:38:47.376105138 +0000 UTC m=+1270.834136689" watchObservedRunningTime="2026-03-20 17:38:47.40929592 +0000 UTC m=+1270.867327461" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.412766 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.413466 
4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="sg-core" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.413493 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="sg-core" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.413512 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api-log" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.413521 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api-log" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.416187 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416208 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.416233 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-notification-agent" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416241 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-notification-agent" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.416256 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="proxy-httpd" Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416263 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="proxy-httpd" Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.416293 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-central-agent"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416300 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-central-agent"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416593 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="sg-core"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416607 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-central-agent"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416619 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="ceilometer-notification-agent"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416634 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api-log"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416644 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" containerName="proxy-httpd"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.416656 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" containerName="cinder-api"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.418612 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.419609 4795 scope.go:117] "RemoveContainer" containerID="a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.421069 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.420903 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c\": container with ID starting with a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c not found: ID does not exist" containerID="a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.421196 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c"} err="failed to get container status \"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c\": rpc error: code = NotFound desc = could not find container \"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c\": container with ID starting with a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c not found: ID does not exist"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.421240 4795 scope.go:117] "RemoveContainer" containerID="10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17"
Mar 20 17:38:47 crc kubenswrapper[4795]: E0320 17:38:47.423469 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17\": container with ID starting with 10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17 not found: ID does not exist" containerID="10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.423503 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17"} err="failed to get container status \"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17\": rpc error: code = NotFound desc = could not find container \"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17\": container with ID starting with 10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17 not found: ID does not exist"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.423540 4795 scope.go:117] "RemoveContainer" containerID="a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.423791 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.433178 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c"} err="failed to get container status \"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c\": rpc error: code = NotFound desc = could not find container \"a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c\": container with ID starting with a8e63eebac6cd111323a8e1d3f5a10bee226d9f011c82df61c4eaab00f42b07c not found: ID does not exist"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.433225 4795 scope.go:117] "RemoveContainer" containerID="10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.438326 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17"} err="failed to get container status \"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17\": rpc error: code = NotFound desc = could not find container \"10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17\": container with ID starting with 10d67502c23cfe214df8c45343a6e9dd2ff3c535224bf8b1ee5448cb58227d17 not found: ID does not exist"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.438368 4795 scope.go:117] "RemoveContainer" containerID="7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.453230 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.466637 4795 scope.go:117] "RemoveContainer" containerID="845acac18cf9a7e90383c869af15e1e179f275559f11db3566d8f11150b78c3d"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504780 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504815 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504923 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrr2b\" (UniqueName: \"kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.504939 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.511048 4795 scope.go:117] "RemoveContainer" containerID="5d540cc1bf447dcf67570fd849a590667f72184381546b2b04f9eddfb973cf69"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.544885 4795 scope.go:117] "RemoveContainer" containerID="3b6b098ddb9cfeee495acedec7b7145d7d7ba5c2f18ba21ad1f2ac7b8c96c1b3"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607135 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607184 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607266 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.607303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrr2b\" (UniqueName: \"kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.609521 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.609777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.614385 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.614507 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.618557 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.619175 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.631551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrr2b\" (UniqueName: \"kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b\") pod \"ceilometer-0\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.646565 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.660303 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.677195 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.678669 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.681553 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.682911 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.684353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.687224 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.711866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2g97\" (UniqueName: \"kubernetes.io/projected/0b19426b-81a4-4498-9754-948e8b7154d9-kube-api-access-s2g97\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.711915 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.711948 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b19426b-81a4-4498-9754-948e8b7154d9-logs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.711965 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b19426b-81a4-4498-9754-948e8b7154d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.712052 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.712080 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.712096 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.712116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-scripts\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.712146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.752449 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2g97\" (UniqueName: \"kubernetes.io/projected/0b19426b-81a4-4498-9754-948e8b7154d9-kube-api-access-s2g97\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814320 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814353 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b19426b-81a4-4498-9754-948e8b7154d9-logs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814394 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b19426b-81a4-4498-9754-948e8b7154d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814522 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814543 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-scripts\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814620 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.814892 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b19426b-81a4-4498-9754-948e8b7154d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.815371 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b19426b-81a4-4498-9754-948e8b7154d9-logs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.821505 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.822127 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-scripts\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.822604 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.824211 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.825889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.826143 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b19426b-81a4-4498-9754-948e8b7154d9-config-data\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:47 crc kubenswrapper[4795]: I0320 17:38:47.837334 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2g97\" (UniqueName: \"kubernetes.io/projected/0b19426b-81a4-4498-9754-948e8b7154d9-kube-api-access-s2g97\") pod \"cinder-api-0\" (UID: \"0b19426b-81a4-4498-9754-948e8b7154d9\") " pod="openstack/cinder-api-0"
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.037822 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.205050 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:38:48 crc kubenswrapper[4795]: W0320 17:38:48.315080 4795 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb13fc8ca_8012_4cf1_bbe4_b83ff3eb3b8d.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb13fc8ca_8012_4cf1_bbe4_b83ff3eb3b8d.slice: no such file or directory
Mar 20 17:38:48 crc kubenswrapper[4795]: W0320 17:38:48.318400 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d348053_c6a6_462c_9e8d_5ff55140a554.slice/crio-7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220.scope WatchSource:0}: Error finding container 7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220: Status 404 returned error can't find the container with id 7dd1bbacbb8a99db17183e42a8adb8b29eb16101c90535b343ed18d423e5c220
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.319740 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerStarted","Data":"a117c1a1ab5b289c10e51e9701c1262a6f901fb4294e9dacf76136a47b3ab85d"}
Mar 20 17:38:48 crc kubenswrapper[4795]: W0320 17:38:48.319846 4795 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod296aa0d1_17fa_44da_8868_1ebb0006c417.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod296aa0d1_17fa_44da_8868_1ebb0006c417.slice: no such file or directory
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.320938 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.324789 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerStarted","Data":"94339a3fab4acea27dea3041f96934ac4f22ad61715ed8e250425566cfb0f82c"}
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.330446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" event={"ID":"faa8c15c-b759-4db8-ac4d-28648a8cfde2","Type":"ContainerStarted","Data":"6b2a655141d0067e1003f83ced2c6fba2714974a090bed13757747c825f7e9e2"}
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.343973 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerID="8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012" exitCode=137
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.346421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerDied","Data":"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012"}
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.361326 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-76977cb5bb-84w8l" podStartSLOduration=2.548760871 podStartE2EDuration="5.361307329s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="2026-03-20 17:38:44.107991318 +0000 UTC m=+1267.566022859" lastFinishedPulling="2026-03-20 17:38:46.920537776 +0000 UTC m=+1270.378569317" observedRunningTime="2026-03-20 17:38:48.34891343 +0000 UTC m=+1271.806944971" watchObservedRunningTime="2026-03-20 17:38:48.361307329 +0000 UTC m=+1271.819338870"
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.788027 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-777644b489-7th7n"
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.841586 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmwrk\" (UniqueName: \"kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk\") pod \"7a1074ea-5432-46f8-ba74-7c68912c68b6\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") "
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.841634 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts\") pod \"7a1074ea-5432-46f8-ba74-7c68912c68b6\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") "
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.841875 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key\") pod \"7a1074ea-5432-46f8-ba74-7c68912c68b6\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") "
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.841902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs\") pod \"7a1074ea-5432-46f8-ba74-7c68912c68b6\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") "
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.841938 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data\") pod \"7a1074ea-5432-46f8-ba74-7c68912c68b6\" (UID: \"7a1074ea-5432-46f8-ba74-7c68912c68b6\") "
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.842393 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs" (OuterVolumeSpecName: "logs") pod "7a1074ea-5432-46f8-ba74-7c68912c68b6" (UID: "7a1074ea-5432-46f8-ba74-7c68912c68b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.849926 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7a1074ea-5432-46f8-ba74-7c68912c68b6" (UID: "7a1074ea-5432-46f8-ba74-7c68912c68b6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.884139 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data" (OuterVolumeSpecName: "config-data") pod "7a1074ea-5432-46f8-ba74-7c68912c68b6" (UID: "7a1074ea-5432-46f8-ba74-7c68912c68b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.891722 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk" (OuterVolumeSpecName: "kube-api-access-vmwrk") pod "7a1074ea-5432-46f8-ba74-7c68912c68b6" (UID: "7a1074ea-5432-46f8-ba74-7c68912c68b6"). InnerVolumeSpecName "kube-api-access-vmwrk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.932001 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts" (OuterVolumeSpecName: "scripts") pod "7a1074ea-5432-46f8-ba74-7c68912c68b6" (UID: "7a1074ea-5432-46f8-ba74-7c68912c68b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.944068 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7a1074ea-5432-46f8-ba74-7c68912c68b6-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.944096 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a1074ea-5432-46f8-ba74-7c68912c68b6-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.944105 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.944113 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmwrk\" (UniqueName: \"kubernetes.io/projected/7a1074ea-5432-46f8-ba74-7c68912c68b6-kube-api-access-vmwrk\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:48 crc kubenswrapper[4795]: I0320 17:38:48.944121 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a1074ea-5432-46f8-ba74-7c68912c68b6-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.031670 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fcd89d897-nsn69"
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.146914 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key\") pod \"d149d116-1195-403f-9546-5b79d24e666d\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") "
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.146946 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts\") pod \"d149d116-1195-403f-9546-5b79d24e666d\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") "
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.146977 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs\") pod \"d149d116-1195-403f-9546-5b79d24e666d\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") "
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.147022 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwnmk\" (UniqueName: \"kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk\") pod \"d149d116-1195-403f-9546-5b79d24e666d\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") "
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.147204 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data\") pod \"d149d116-1195-403f-9546-5b79d24e666d\" (UID: \"d149d116-1195-403f-9546-5b79d24e666d\") "
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.147386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs" (OuterVolumeSpecName: "logs") pod "d149d116-1195-403f-9546-5b79d24e666d" (UID: "d149d116-1195-403f-9546-5b79d24e666d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.147919 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d149d116-1195-403f-9546-5b79d24e666d-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.150813 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d149d116-1195-403f-9546-5b79d24e666d" (UID: "d149d116-1195-403f-9546-5b79d24e666d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.150962 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk" (OuterVolumeSpecName: "kube-api-access-pwnmk") pod "d149d116-1195-403f-9546-5b79d24e666d" (UID: "d149d116-1195-403f-9546-5b79d24e666d"). InnerVolumeSpecName "kube-api-access-pwnmk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.174532 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts" (OuterVolumeSpecName: "scripts") pod "d149d116-1195-403f-9546-5b79d24e666d" (UID: "d149d116-1195-403f-9546-5b79d24e666d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.175888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data" (OuterVolumeSpecName: "config-data") pod "d149d116-1195-403f-9546-5b79d24e666d" (UID: "d149d116-1195-403f-9546-5b79d24e666d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.253859 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.254403 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d149d116-1195-403f-9546-5b79d24e666d-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.254587 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d149d116-1195-403f-9546-5b79d24e666d-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.254770 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwnmk\" (UniqueName: \"kubernetes.io/projected/d149d116-1195-403f-9546-5b79d24e666d-kube-api-access-pwnmk\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:49
crc kubenswrapper[4795]: I0320 17:38:49.273792 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="296aa0d1-17fa-44da-8868-1ebb0006c417" path="/var/lib/kubelet/pods/296aa0d1-17fa-44da-8868-1ebb0006c417/volumes" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.274834 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d348053-c6a6-462c-9e8d-5ff55140a554" path="/var/lib/kubelet/pods/5d348053-c6a6-462c-9e8d-5ff55140a554/volumes" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.394068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerStarted","Data":"9a3186bfb24fbd40e158c8315de5c6475f512255ad1c85538a25451c575cc22c"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.396998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b19426b-81a4-4498-9754-948e8b7154d9","Type":"ContainerStarted","Data":"7ab088ee63c56a27e095e3cd75fb5a4335bff783c050d610b3df68526323e45b"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.397047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b19426b-81a4-4498-9754-948e8b7154d9","Type":"ContainerStarted","Data":"c7a852dd130a3bede4b58f63940fd0bea93400b031cae542dee56b65c34701a8"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409378 4795 generic.go:334] "Generic (PLEG): container finished" podID="d149d116-1195-403f-9546-5b79d24e666d" containerID="109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" exitCode=137 Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409751 4795 generic.go:334] "Generic (PLEG): container finished" podID="d149d116-1195-403f-9546-5b79d24e666d" containerID="71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" exitCode=137 Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409840 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerDied","Data":"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerDied","Data":"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fcd89d897-nsn69" event={"ID":"d149d116-1195-403f-9546-5b79d24e666d","Type":"ContainerDied","Data":"02a106435121a29bb7e883006bb45d54dcf75dcebd8e8d213a1788cfe4f4db42"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.409913 4795 scope.go:117] "RemoveContainer" containerID="109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.410093 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fcd89d897-nsn69" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.422307 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerID="fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686" exitCode=137 Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.422406 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerDied","Data":"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.422451 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-777644b489-7th7n" event={"ID":"7a1074ea-5432-46f8-ba74-7c68912c68b6","Type":"ContainerDied","Data":"715dd3d63dfa37a7a756a39bf44ed5e42b3e66055f9f285906dfbd2b63c913d6"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.422497 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-777644b489-7th7n" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.424815 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.583010201 podStartE2EDuration="6.424799639s" podCreationTimestamp="2026-03-20 17:38:43 +0000 UTC" firstStartedPulling="2026-03-20 17:38:44.368322856 +0000 UTC m=+1267.826354397" lastFinishedPulling="2026-03-20 17:38:47.210112274 +0000 UTC m=+1270.668143835" observedRunningTime="2026-03-20 17:38:49.41718589 +0000 UTC m=+1272.875217441" watchObservedRunningTime="2026-03-20 17:38:49.424799639 +0000 UTC m=+1272.882831180" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.425229 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerStarted","Data":"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"} Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.459560 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fcd89d897-nsn69"] Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.476771 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-fcd89d897-nsn69"] Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.485819 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-777644b489-7th7n"] Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.496823 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-777644b489-7th7n"] Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.503748 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84776bb8f8-wkk7m"] Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.504137 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon-log" Mar 20 17:38:49 
crc kubenswrapper[4795]: I0320 17:38:49.504152 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon-log" Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.504166 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504172 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.504190 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon-log" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504196 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon-log" Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.504217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504223 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504378 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon-log" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504402 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504412 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" containerName="horizon" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.504422 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d149d116-1195-403f-9546-5b79d24e666d" containerName="horizon-log" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.505342 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.512254 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.512670 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84776bb8f8-wkk7m"] Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.513187 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.562601 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-combined-ca-bundle\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.562650 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-logs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.562862 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-internal-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " 
pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.562955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data-custom\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.563047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h7k9\" (UniqueName: \"kubernetes.io/projected/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-kube-api-access-2h7k9\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.563097 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-public-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.563135 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.617143 4795 scope.go:117] "RemoveContainer" containerID="71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.664764 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-internal-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.664842 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data-custom\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.664895 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h7k9\" (UniqueName: \"kubernetes.io/projected/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-kube-api-access-2h7k9\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.664932 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-public-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.664962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.665003 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-combined-ca-bundle\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.665030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-logs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.665565 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-logs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.666883 4795 scope.go:117] "RemoveContainer" containerID="109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.667718 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d\": container with ID starting with 109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d not found: ID does not exist" containerID="109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.667804 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d"} err="failed to get container status 
\"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d\": rpc error: code = NotFound desc = could not find container \"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d\": container with ID starting with 109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d not found: ID does not exist" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.667838 4795 scope.go:117] "RemoveContainer" containerID="71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" Mar 20 17:38:49 crc kubenswrapper[4795]: E0320 17:38:49.668465 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449\": container with ID starting with 71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449 not found: ID does not exist" containerID="71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.668520 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449"} err="failed to get container status \"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449\": rpc error: code = NotFound desc = could not find container \"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449\": container with ID starting with 71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449 not found: ID does not exist" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.668545 4795 scope.go:117] "RemoveContainer" containerID="109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.669376 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d"} err="failed to get 
container status \"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d\": rpc error: code = NotFound desc = could not find container \"109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d\": container with ID starting with 109d5f32736ad618bffec37cde69267d03322314b0cff75e3c002e8ae518e86d not found: ID does not exist" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.669402 4795 scope.go:117] "RemoveContainer" containerID="71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.671679 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449"} err="failed to get container status \"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449\": rpc error: code = NotFound desc = could not find container \"71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449\": container with ID starting with 71b90e79181fe6005790a34e6509d3fcbe58ab56127f28fdab170a9735473449 not found: ID does not exist" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.671731 4795 scope.go:117] "RemoveContainer" containerID="8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.674673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-internal-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.674991 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-public-tls-certs\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: 
\"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.675296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data-custom\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.676351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-combined-ca-bundle\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.680894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-config-data\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.684022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h7k9\" (UniqueName: \"kubernetes.io/projected/6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97-kube-api-access-2h7k9\") pod \"barbican-api-84776bb8f8-wkk7m\" (UID: \"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97\") " pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.839513 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84776bb8f8-wkk7m" Mar 20 17:38:49 crc kubenswrapper[4795]: I0320 17:38:49.886653 4795 scope.go:117] "RemoveContainer" containerID="616dc96d2d0585b233d6d56de6ca35d75cb094f9e09720808ac513c2c13b7e20" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.050104 4795 scope.go:117] "RemoveContainer" containerID="fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.256661 4795 scope.go:117] "RemoveContainer" containerID="8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012" Mar 20 17:38:50 crc kubenswrapper[4795]: E0320 17:38:50.260810 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012\": container with ID starting with 8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012 not found: ID does not exist" containerID="8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.260856 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012"} err="failed to get container status \"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012\": rpc error: code = NotFound desc = could not find container \"8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012\": container with ID starting with 8cb3891914ea1b3f66c610cd2a96da14772b7532f9849a2b5a160d80770e2012 not found: ID does not exist" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.260878 4795 scope.go:117] "RemoveContainer" containerID="fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686" Mar 20 17:38:50 crc kubenswrapper[4795]: E0320 17:38:50.263888 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686\": container with ID starting with fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686 not found: ID does not exist" containerID="fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.263916 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686"} err="failed to get container status \"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686\": rpc error: code = NotFound desc = could not find container \"fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686\": container with ID starting with fc4882231859f73db5359c60baf4cd5636d926188c23b76849b0cdf615b1d686 not found: ID does not exist" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.491555 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerStarted","Data":"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"} Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.533740 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b19426b-81a4-4498-9754-948e8b7154d9","Type":"ContainerStarted","Data":"8f9d0c257765e53f7c8c766f6912e7a2acee2cd25b9c5dbd8ab6d4c65c12b918"} Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.533806 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.564319 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.564301958 podStartE2EDuration="3.564301958s" podCreationTimestamp="2026-03-20 17:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:50.559016462 +0000 UTC m=+1274.017048003" watchObservedRunningTime="2026-03-20 17:38:50.564301958 +0000 UTC m=+1274.022333499"
Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.813904 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84776bb8f8-wkk7m"]
Mar 20 17:38:50 crc kubenswrapper[4795]: I0320 17:38:50.929463 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.092403 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.275023 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1074ea-5432-46f8-ba74-7c68912c68b6" path="/var/lib/kubelet/pods/7a1074ea-5432-46f8-ba74-7c68912c68b6/volumes"
Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.275909 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d149d116-1195-403f-9546-5b79d24e666d" path="/var/lib/kubelet/pods/d149d116-1195-403f-9546-5b79d24e666d/volumes"
Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.543571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84776bb8f8-wkk7m" event={"ID":"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97","Type":"ContainerStarted","Data":"d2bcafbd4b3c6341a57ef7acb3a881cc34da497929dae0864d611cae93dcdfec"}
Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.543624 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84776bb8f8-wkk7m" event={"ID":"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97","Type":"ContainerStarted","Data":"1ad175aaa2232ed6e1ae62f17a7bec2babde25162a97cd846f8f93d181f436af"}
Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.543641 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84776bb8f8-wkk7m" event={"ID":"6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97","Type":"ContainerStarted","Data":"4fc057768ef8270240a33fb541dacaefe17d9040c8ebf8706a7a941e117a701a"}
Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.543948 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84776bb8f8-wkk7m"
Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.543982 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84776bb8f8-wkk7m"
Mar 20 17:38:51 crc kubenswrapper[4795]: I0320 17:38:51.546456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerStarted","Data":"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83"}
Mar 20 17:38:52 crc kubenswrapper[4795]: I0320 17:38:52.795198 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7fb74ddb8-dbrvh"
Mar 20 17:38:52 crc kubenswrapper[4795]: I0320 17:38:52.855496 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:38:52 crc kubenswrapper[4795]: I0320 17:38:52.861518 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84776bb8f8-wkk7m" podStartSLOduration=3.861486146 podStartE2EDuration="3.861486146s" podCreationTimestamp="2026-03-20 17:38:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:51.582226257 +0000 UTC m=+1275.040257798" watchObservedRunningTime="2026-03-20 17:38:52.861486146 +0000 UTC m=+1276.319517727"
Mar 20 17:38:52 crc kubenswrapper[4795]: I0320 17:38:52.913392 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"]
Mar 20 17:38:53 crc kubenswrapper[4795]: I0320 17:38:53.569216 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon-log" containerID="cri-o://da11e766148fb6f38d02c50468b495d9c10ec9fe653ddad3b144b8edd961b2d3" gracePeriod=30
Mar 20 17:38:53 crc kubenswrapper[4795]: I0320 17:38:53.569281 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" containerID="cri-o://5be00c0e636ec09ccd42a36c542755b2d984e3e3c6dddd06a91f3eb8b8a7efdb" gracePeriod=30
Mar 20 17:38:53 crc kubenswrapper[4795]: I0320 17:38:53.864459 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 20 17:38:53 crc kubenswrapper[4795]: I0320 17:38:53.942487 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.026317 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"]
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.027316 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="dnsmasq-dns" containerID="cri-o://076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773" gracePeriod=10
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.146510 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.514822 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.578375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") "
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.578892 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") "
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.579086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") "
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.579127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6csjq\" (UniqueName: \"kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") "
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.579155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") "
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.579218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc\") pod \"7a77884f-5f74-473c-9875-d7afc62ab2f5\" (UID: \"7a77884f-5f74-473c-9875-d7afc62ab2f5\") "
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.604860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq" (OuterVolumeSpecName: "kube-api-access-6csjq") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "kube-api-access-6csjq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.605056 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerID="076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773" exitCode=0
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.605196 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.605886 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" event={"ID":"7a77884f-5f74-473c-9875-d7afc62ab2f5","Type":"ContainerDied","Data":"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773"}
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.605913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-t6ztr" event={"ID":"7a77884f-5f74-473c-9875-d7afc62ab2f5","Type":"ContainerDied","Data":"af377878f795441c98f680067ea533216f89a8059101023b972799ab26727a8a"}
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.605928 4795 scope.go:117] "RemoveContainer" containerID="076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.623788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerStarted","Data":"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6"}
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.624093 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.658216 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.263269974 podStartE2EDuration="7.658198741s" podCreationTimestamp="2026-03-20 17:38:47 +0000 UTC" firstStartedPulling="2026-03-20 17:38:48.379440449 +0000 UTC m=+1271.837471990" lastFinishedPulling="2026-03-20 17:38:53.774369105 +0000 UTC m=+1277.232400757" observedRunningTime="2026-03-20 17:38:54.641558539 +0000 UTC m=+1278.099590080" watchObservedRunningTime="2026-03-20 17:38:54.658198741 +0000 UTC m=+1278.116230282"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.660976 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.661746 4795 scope.go:117] "RemoveContainer" containerID="4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.664708 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.680883 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config" (OuterVolumeSpecName: "config") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.683112 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.683133 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.683145 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6csjq\" (UniqueName: \"kubernetes.io/projected/7a77884f-5f74-473c-9875-d7afc62ab2f5-kube-api-access-6csjq\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.683154 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.709631 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.712159 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.718862 4795 scope.go:117] "RemoveContainer" containerID="076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773"
Mar 20 17:38:54 crc kubenswrapper[4795]: E0320 17:38:54.719261 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773\": container with ID starting with 076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773 not found: ID does not exist" containerID="076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.719289 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773"} err="failed to get container status \"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773\": rpc error: code = NotFound desc = could not find container \"076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773\": container with ID starting with 076c350b972335582daf65b76640952e67d819c274ac9ac75e2937915693e773 not found: ID does not exist"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.719312 4795 scope.go:117] "RemoveContainer" containerID="4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163"
Mar 20 17:38:54 crc kubenswrapper[4795]: E0320 17:38:54.719753 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163\": container with ID starting with 4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163 not found: ID does not exist" containerID="4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.719776 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163"} err="failed to get container status \"4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163\": rpc error: code = NotFound desc = could not find container \"4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163\": container with ID starting with 4bb82fe8d6f05760ed2bd3beffdbd340d6057a7bb01ce7b0f87c29854578c163 not found: ID does not exist"
Mar 20 17:38:54 crc kubenswrapper[4795]: I0320 17:38:54.720193 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a77884f-5f74-473c-9875-d7afc62ab2f5" (UID: "7a77884f-5f74-473c-9875-d7afc62ab2f5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:54.794489 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:54.794523 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a77884f-5f74-473c-9875-d7afc62ab2f5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:54.945786 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"]
Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:54.952346 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-t6ztr"]
Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:55.268644 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" path="/var/lib/kubelet/pods/7a77884f-5f74-473c-9875-d7afc62ab2f5/volumes"
Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:55.490006 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:55.491075 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6df8664bf8-htftz"
Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:55.630000 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="cinder-scheduler" containerID="cri-o://94339a3fab4acea27dea3041f96934ac4f22ad61715ed8e250425566cfb0f82c" gracePeriod=30
Mar 20 17:38:55 crc kubenswrapper[4795]: I0320 17:38:55.630093 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="probe" containerID="cri-o://9a3186bfb24fbd40e158c8315de5c6475f512255ad1c85538a25451c575cc22c" gracePeriod=30
Mar 20 17:38:56 crc kubenswrapper[4795]: I0320 17:38:56.641191 4795 generic.go:334] "Generic (PLEG): container finished" podID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerID="9a3186bfb24fbd40e158c8315de5c6475f512255ad1c85538a25451c575cc22c" exitCode=0
Mar 20 17:38:56 crc kubenswrapper[4795]: I0320 17:38:56.641443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerDied","Data":"9a3186bfb24fbd40e158c8315de5c6475f512255ad1c85538a25451c575cc22c"}
Mar 20 17:38:57 crc kubenswrapper[4795]: I0320 17:38:57.655773 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerID="5be00c0e636ec09ccd42a36c542755b2d984e3e3c6dddd06a91f3eb8b8a7efdb" exitCode=0
Mar 20 17:38:57 crc kubenswrapper[4795]: I0320 17:38:57.655842 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerDied","Data":"5be00c0e636ec09ccd42a36c542755b2d984e3e3c6dddd06a91f3eb8b8a7efdb"}
Mar 20 17:38:58 crc kubenswrapper[4795]: I0320 17:38:58.778187 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused"
Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.351598 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-85b996ff68-fdzxg"
Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.689133 4795 generic.go:334] "Generic (PLEG): container finished" podID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerID="94339a3fab4acea27dea3041f96934ac4f22ad61715ed8e250425566cfb0f82c" exitCode=0
Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.689220 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerDied","Data":"94339a3fab4acea27dea3041f96934ac4f22ad61715ed8e250425566cfb0f82c"}
Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.949043 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6gqg\" (UniqueName: \"kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") "
Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992719 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") "
Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") "
Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992829 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") "
Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") "
Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.992906 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle\") pod \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\" (UID: \"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0\") "
Mar 20 17:38:59 crc kubenswrapper[4795]: I0320 17:38:59.996758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.005886 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.010860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg" (OuterVolumeSpecName: "kube-api-access-b6gqg") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "kube-api-access-b6gqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.021296 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts" (OuterVolumeSpecName: "scripts") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.094802 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6gqg\" (UniqueName: \"kubernetes.io/projected/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-kube-api-access-b6gqg\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.095083 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.095093 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.095101 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.117942 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.157180 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data" (OuterVolumeSpecName: "config-data") pod "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" (UID: "0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.183040 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.197046 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.197074 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.698167 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0","Type":"ContainerDied","Data":"c8191878b7b3ac40636daf5acafd14c09be03b0e522e52658ab845e1056e98aa"}
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.698213 4795 scope.go:117] "RemoveContainer" containerID="9a3186bfb24fbd40e158c8315de5c6475f512255ad1c85538a25451c575cc22c"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.698326 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.725356 4795 scope.go:117] "RemoveContainer" containerID="94339a3fab4acea27dea3041f96934ac4f22ad61715ed8e250425566cfb0f82c"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.736176 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.750961 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.778799 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 17:39:00 crc kubenswrapper[4795]: E0320 17:39:00.779208 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="probe"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779221 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="probe"
Mar 20 17:39:00 crc kubenswrapper[4795]: E0320 17:39:00.779237 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="dnsmasq-dns"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779244 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="dnsmasq-dns"
Mar 20 17:39:00 crc kubenswrapper[4795]: E0320 17:39:00.779271 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="cinder-scheduler"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779277 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="cinder-scheduler"
Mar 20 17:39:00 crc kubenswrapper[4795]: E0320 17:39:00.779296 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="init"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779302 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="init"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779464 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="probe"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779473 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" containerName="cinder-scheduler"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.779480 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a77884f-5f74-473c-9875-d7afc62ab2f5" containerName="dnsmasq-dns"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.780361 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.783354 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.805742 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmd5w\" (UniqueName: \"kubernetes.io/projected/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-kube-api-access-gmd5w\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807702 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807808 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807904 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.807926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-scripts\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-scripts\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmd5w\" (UniqueName: \"kubernetes.io/projected/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-kube-api-access-gmd5w\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.909738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.911009 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.914634 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.918467 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.919245 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.919619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-scripts\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:00 crc kubenswrapper[4795]: I0320 17:39:00.947531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmd5w\" (UniqueName: \"kubernetes.io/projected/8eeb13a5-ef36-44eb-9dfd-7798e9ad1620-kube-api-access-gmd5w\") pod \"cinder-scheduler-0\" (UID: \"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620\") " pod="openstack/cinder-scheduler-0"
Mar 20 17:39:01 crc kubenswrapper[4795]: I0320 17:39:01.105198 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 17:39:01 crc kubenswrapper[4795]: I0320 17:39:01.269523 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0" path="/var/lib/kubelet/pods/0b40bb76-eaf8-4e7a-bc58-f0dabc025ac0/volumes"
Mar 20 17:39:01 crc kubenswrapper[4795]: I0320 17:39:01.644322 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 17:39:01 crc kubenswrapper[4795]: I0320 17:39:01.709713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620","Type":"ContainerStarted","Data":"fc2d0c41a206352f47f3afd4c5dfbcbe68ff3f14b45245d3886e49cda2910357"}
Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.004975 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84776bb8f8-wkk7m"
Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.179471 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.182187 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-fc784f9bb-wjct6"
Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.257154 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84776bb8f8-wkk7m"
Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.323285 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"] Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.323552 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6df8664bf8-htftz" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api-log" containerID="cri-o://390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16" gracePeriod=30 Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.323940 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6df8664bf8-htftz" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api" containerID="cri-o://195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805" gracePeriod=30 Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.463937 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.466243 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.471133 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.471318 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-s2zd2" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.471430 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.485807 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.653587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.654030 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.654073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t6b2\" (UniqueName: \"kubernetes.io/projected/cf3f8aea-393e-418a-ad14-2848c8df93e9-kube-api-access-2t6b2\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.654132 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.725738 4795 generic.go:334] "Generic (PLEG): container finished" podID="a4150989-c1d2-4afd-b815-cda32fec2835" containerID="390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16" exitCode=143 Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.725794 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerDied","Data":"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16"} Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.728590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620","Type":"ContainerStarted","Data":"83913acd00457440fcb89cd92b47bfecfaf55dcff6920c9989fe7d6025e926e9"} Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.756217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.756279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t6b2\" (UniqueName: \"kubernetes.io/projected/cf3f8aea-393e-418a-ad14-2848c8df93e9-kube-api-access-2t6b2\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.756365 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.756404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.758049 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.773418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t6b2\" (UniqueName: \"kubernetes.io/projected/cf3f8aea-393e-418a-ad14-2848c8df93e9-kube-api-access-2t6b2\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.784798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.787756 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/cf3f8aea-393e-418a-ad14-2848c8df93e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"cf3f8aea-393e-418a-ad14-2848c8df93e9\") " pod="openstack/openstackclient" Mar 20 17:39:02 crc kubenswrapper[4795]: I0320 17:39:02.809951 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.248820 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-649db44647-mrjns" Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.313676 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-575df674dd-5xp2t"] Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.314212 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-575df674dd-5xp2t" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-api" containerID="cri-o://d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a" gracePeriod=30 Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.314350 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-575df674dd-5xp2t" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-httpd" containerID="cri-o://41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476" gracePeriod=30 Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.353368 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.737962 4795 generic.go:334] "Generic (PLEG): container finished" podID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerID="41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476" exitCode=0 Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.738043 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" 
event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerDied","Data":"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476"} Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.739948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8eeb13a5-ef36-44eb-9dfd-7798e9ad1620","Type":"ContainerStarted","Data":"b696d40be59c077e289a843dab61408c60fd455055ae2c79762077e7b82d0ff7"} Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.741495 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"cf3f8aea-393e-418a-ad14-2848c8df93e9","Type":"ContainerStarted","Data":"16a9c60148a3b16e42c123935815cea94f189aeb0c36189715acf27ac5f72666"} Mar 20 17:39:03 crc kubenswrapper[4795]: I0320 17:39:03.762360 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.762343726 podStartE2EDuration="3.762343726s" podCreationTimestamp="2026-03-20 17:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:03.760048864 +0000 UTC m=+1287.218080405" watchObservedRunningTime="2026-03-20 17:39:03.762343726 +0000 UTC m=+1287.220375267" Mar 20 17:39:05 crc kubenswrapper[4795]: I0320 17:39:05.797485 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6df8664bf8-htftz" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:34676->10.217.0.166:9311: read: connection reset by peer" Mar 20 17:39:05 crc kubenswrapper[4795]: I0320 17:39:05.799223 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6df8664bf8-htftz" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api" probeResult="failure" output="Get 
\"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:34674->10.217.0.166:9311: read: connection reset by peer" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.105830 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.225794 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.349742 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data\") pod \"a4150989-c1d2-4afd-b815-cda32fec2835\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.349788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom\") pod \"a4150989-c1d2-4afd-b815-cda32fec2835\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.349819 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs\") pod \"a4150989-c1d2-4afd-b815-cda32fec2835\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.349856 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle\") pod \"a4150989-c1d2-4afd-b815-cda32fec2835\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.349921 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5xvp5\" (UniqueName: \"kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5\") pod \"a4150989-c1d2-4afd-b815-cda32fec2835\" (UID: \"a4150989-c1d2-4afd-b815-cda32fec2835\") " Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.351983 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs" (OuterVolumeSpecName: "logs") pod "a4150989-c1d2-4afd-b815-cda32fec2835" (UID: "a4150989-c1d2-4afd-b815-cda32fec2835"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.358743 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4150989-c1d2-4afd-b815-cda32fec2835" (UID: "a4150989-c1d2-4afd-b815-cda32fec2835"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.379704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5" (OuterVolumeSpecName: "kube-api-access-5xvp5") pod "a4150989-c1d2-4afd-b815-cda32fec2835" (UID: "a4150989-c1d2-4afd-b815-cda32fec2835"). InnerVolumeSpecName "kube-api-access-5xvp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.388882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4150989-c1d2-4afd-b815-cda32fec2835" (UID: "a4150989-c1d2-4afd-b815-cda32fec2835"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.410784 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data" (OuterVolumeSpecName: "config-data") pod "a4150989-c1d2-4afd-b815-cda32fec2835" (UID: "a4150989-c1d2-4afd-b815-cda32fec2835"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.455877 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.456007 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.456035 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4150989-c1d2-4afd-b815-cda32fec2835-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.456046 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4150989-c1d2-4afd-b815-cda32fec2835-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.456055 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xvp5\" (UniqueName: \"kubernetes.io/projected/a4150989-c1d2-4afd-b815-cda32fec2835-kube-api-access-5xvp5\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.776099 4795 generic.go:334] "Generic (PLEG): container finished" podID="a4150989-c1d2-4afd-b815-cda32fec2835" 
containerID="195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805" exitCode=0 Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.776154 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerDied","Data":"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805"} Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.776178 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6df8664bf8-htftz" event={"ID":"a4150989-c1d2-4afd-b815-cda32fec2835","Type":"ContainerDied","Data":"57c443aca3511d65cb1758f7e520aeb9b66168af6d8e810d947b3d385977aa2d"} Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.776207 4795 scope.go:117] "RemoveContainer" containerID="195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.776320 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6df8664bf8-htftz" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.811028 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"] Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.816262 4795 scope.go:117] "RemoveContainer" containerID="390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.818051 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6df8664bf8-htftz"] Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.842900 4795 scope.go:117] "RemoveContainer" containerID="195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805" Mar 20 17:39:06 crc kubenswrapper[4795]: E0320 17:39:06.843413 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805\": container with ID starting with 195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805 not found: ID does not exist" containerID="195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.843459 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805"} err="failed to get container status \"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805\": rpc error: code = NotFound desc = could not find container \"195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805\": container with ID starting with 195f48d45251a1bca8c091cae34aa536295a433f617906e4aec1e8dd1a5ee805 not found: ID does not exist" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.843488 4795 scope.go:117] "RemoveContainer" containerID="390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16" Mar 
20 17:39:06 crc kubenswrapper[4795]: E0320 17:39:06.843838 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16\": container with ID starting with 390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16 not found: ID does not exist" containerID="390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.843878 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16"} err="failed to get container status \"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16\": rpc error: code = NotFound desc = could not find container \"390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16\": container with ID starting with 390031c6068f55528f08f70db08b4568ec93e87dd97526a9346154dbe44dde16 not found: ID does not exist" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.977694 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6697f55ff5-fj55x"] Mar 20 17:39:06 crc kubenswrapper[4795]: E0320 17:39:06.978017 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api-log" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.978033 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api-log" Mar 20 17:39:06 crc kubenswrapper[4795]: E0320 17:39:06.978055 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.978062 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" 
containerName="barbican-api" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.978229 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api-log" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.978249 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" containerName="barbican-api" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.983152 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.987665 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.999430 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 17:39:06 crc kubenswrapper[4795]: I0320 17:39:06.999613 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.006607 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6697f55ff5-fj55x"] Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167567 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-public-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167658 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-run-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" 
(UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167717 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-config-data\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167740 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndxnw\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-kube-api-access-ndxnw\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167761 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-log-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167822 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-etc-swift\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167839 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-combined-ca-bundle\") pod 
\"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.167873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-internal-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.262254 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4150989-c1d2-4afd-b815-cda32fec2835" path="/var/lib/kubelet/pods/a4150989-c1d2-4afd-b815-cda32fec2835/volumes" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269471 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-config-data\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndxnw\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-kube-api-access-ndxnw\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-log-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 
17:39:07.269588 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-etc-swift\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-combined-ca-bundle\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269636 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-internal-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269660 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-public-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.269727 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-run-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.270070 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-log-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.270986 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-run-httpd\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.276604 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-config-data\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.288648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-etc-swift\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.289080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-combined-ca-bundle\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.290192 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-internal-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.294373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndxnw\" (UniqueName: \"kubernetes.io/projected/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-kube-api-access-ndxnw\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.294677 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6-public-tls-certs\") pod \"swift-proxy-6697f55ff5-fj55x\" (UID: \"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6\") " pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.305095 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.619401 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.620193 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-central-agent" containerID="cri-o://bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" gracePeriod=30 Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.620282 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="sg-core" containerID="cri-o://b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" gracePeriod=30 Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.620305 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-notification-agent" containerID="cri-o://e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" gracePeriod=30 Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.620337 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="proxy-httpd" containerID="cri-o://807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" gracePeriod=30 Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.634785 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.169:3000/\": EOF" Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.796010 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" exitCode=2 Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.796156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerDied","Data":"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83"} Mar 20 17:39:07 crc kubenswrapper[4795]: I0320 17:39:07.855761 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6697f55ff5-fj55x"] Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.444761 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611311 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611433 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vrr2b\" (UniqueName: \"kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611490 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.611557 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data\") pod \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\" (UID: \"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c\") " Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.612323 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.612584 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.615342 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b" (OuterVolumeSpecName: "kube-api-access-vrr2b") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "kube-api-access-vrr2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.617810 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts" (OuterVolumeSpecName: "scripts") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.645050 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.689256 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.710129 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data" (OuterVolumeSpecName: "config-data") pod "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" (UID: "f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713206 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713859 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713881 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713893 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrr2b\" (UniqueName: \"kubernetes.io/projected/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-kube-api-access-vrr2b\") on node \"crc\" DevicePath \"\"" 
Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713922 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713931 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.713939 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.779051 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810022 4795 generic.go:334] "Generic (PLEG): container finished" podID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" exitCode=0 Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810054 4795 generic.go:334] "Generic (PLEG): container finished" podID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" exitCode=0 Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810066 4795 generic.go:334] "Generic (PLEG): container finished" podID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" exitCode=0 Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 
17:39:08.810097 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerDied","Data":"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerDied","Data":"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810226 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerDied","Data":"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c","Type":"ContainerDied","Data":"a117c1a1ab5b289c10e51e9701c1262a6f901fb4294e9dacf76136a47b3ab85d"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.810251 4795 scope.go:117] "RemoveContainer" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.812557 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6697f55ff5-fj55x" event={"ID":"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6","Type":"ContainerStarted","Data":"cecdd00c873c6d628dd822fd5569140a0d1f2830367ac85f8e835e186166f659"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.812585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6697f55ff5-fj55x" 
event={"ID":"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6","Type":"ContainerStarted","Data":"c8a0719b6dee041d8bdac2720159d6c04c39cca376780f0e5f7462217dfff21b"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.812595 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6697f55ff5-fj55x" event={"ID":"e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6","Type":"ContainerStarted","Data":"1839cb46b9a07d6e5f4fb99ff3cc20b86bec8297ab56d804ac648323238e522a"} Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.812722 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6697f55ff5-fj55x" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.847569 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6697f55ff5-fj55x" podStartSLOduration=2.847549993 podStartE2EDuration="2.847549993s" podCreationTimestamp="2026-03-20 17:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:08.83056333 +0000 UTC m=+1292.288594871" watchObservedRunningTime="2026-03-20 17:39:08.847549993 +0000 UTC m=+1292.305581534" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.847734 4795 scope.go:117] "RemoveContainer" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.872663 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.885802 4795 scope.go:117] "RemoveContainer" containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.887433 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902128 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.902586 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-central-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902602 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-central-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.902616 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="proxy-httpd" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902622 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="proxy-httpd" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.902668 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-notification-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902676 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-notification-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.902699 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="sg-core" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902705 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="sg-core" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902868 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="proxy-httpd" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902889 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="sg-core" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902903 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-central-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.902914 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" containerName="ceilometer-notification-agent" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.904594 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.909395 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.909719 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.913565 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.965798 4795 scope.go:117] "RemoveContainer" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.985264 4795 scope.go:117] "RemoveContainer" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.985777 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": container with ID starting with 807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6 not found: ID does not exist" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" Mar 20 17:39:08 crc 
kubenswrapper[4795]: I0320 17:39:08.985826 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6"} err="failed to get container status \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": rpc error: code = NotFound desc = could not find container \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": container with ID starting with 807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6 not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.985875 4795 scope.go:117] "RemoveContainer" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.986351 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": container with ID starting with b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83 not found: ID does not exist" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.986392 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83"} err="failed to get container status \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": rpc error: code = NotFound desc = could not find container \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": container with ID starting with b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83 not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.986419 4795 scope.go:117] "RemoveContainer" containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" Mar 20 
17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.986736 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": container with ID starting with e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f not found: ID does not exist" containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.986779 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"} err="failed to get container status \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": rpc error: code = NotFound desc = could not find container \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": container with ID starting with e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.986795 4795 scope.go:117] "RemoveContainer" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" Mar 20 17:39:08 crc kubenswrapper[4795]: E0320 17:39:08.987178 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": container with ID starting with bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa not found: ID does not exist" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.987204 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"} err="failed to get container status 
\"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": rpc error: code = NotFound desc = could not find container \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": container with ID starting with bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.987220 4795 scope.go:117] "RemoveContainer" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.987439 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6"} err="failed to get container status \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": rpc error: code = NotFound desc = could not find container \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": container with ID starting with 807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6 not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.987458 4795 scope.go:117] "RemoveContainer" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.998638 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83"} err="failed to get container status \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": rpc error: code = NotFound desc = could not find container \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": container with ID starting with b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83 not found: ID does not exist" Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.998672 4795 scope.go:117] "RemoveContainer" 
containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"
Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999113 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"} err="failed to get container status \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": rpc error: code = NotFound desc = could not find container \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": container with ID starting with e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f not found: ID does not exist"
Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999144 4795 scope.go:117] "RemoveContainer" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"
Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999515 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"} err="failed to get container status \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": rpc error: code = NotFound desc = could not find container \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": container with ID starting with bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa not found: ID does not exist"
Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999538 4795 scope.go:117] "RemoveContainer" containerID="807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6"
Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999767 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6"} err="failed to get container status \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": rpc error: code = NotFound desc = could not find container \"807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6\": container with ID starting with 807a9f0e3cd9179412396f8b4f20b2d3802bff7738d2657c034779d4aaf675e6 not found: ID does not exist"
Mar 20 17:39:08 crc kubenswrapper[4795]: I0320 17:39:08.999793 4795 scope.go:117] "RemoveContainer" containerID="b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.000048 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83"} err="failed to get container status \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": rpc error: code = NotFound desc = could not find container \"b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83\": container with ID starting with b164c0a9f5cf2701fc3005675940b0862f1a845d3a38f04989d59867a5a9ff83 not found: ID does not exist"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.000069 4795 scope.go:117] "RemoveContainer" containerID="e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.000343 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f"} err="failed to get container status \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": rpc error: code = NotFound desc = could not find container \"e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f\": container with ID starting with e33e43875033c51056271802a2793274979726c4efb91edfd62404163b50cc9f not found: ID does not exist"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.000447 4795 scope.go:117] "RemoveContainer" containerID="bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.000658 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa"} err="failed to get container status \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": rpc error: code = NotFound desc = could not find container \"bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa\": container with ID starting with bbe3e141179f6ec4a629d011c7e5392f5a26a960fb0feff37622d41ac0427ffa not found: ID does not exist"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018078 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk8fw\" (UniqueName: \"kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018179 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018199 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018242 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018258 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.018301 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119309 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119451 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk8fw\" (UniqueName: \"kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119527 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.119542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.120028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.122210 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.135204 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.136476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.142197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.142246 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk8fw\" (UniqueName: \"kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.147842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.257230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.263333 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c" path="/var/lib/kubelet/pods/f2f41d40-5b8b-48dc-bbc3-c0dcb84bdb5c/volumes"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.521908 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.631907 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config\") pod \"8cce4da2-83af-4f8a-9923-d618bd8a9225\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") "
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.631966 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle\") pod \"8cce4da2-83af-4f8a-9923-d618bd8a9225\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") "
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.632031 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2x7h\" (UniqueName: \"kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h\") pod \"8cce4da2-83af-4f8a-9923-d618bd8a9225\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") "
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.632063 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config\") pod \"8cce4da2-83af-4f8a-9923-d618bd8a9225\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") "
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.632109 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs\") pod \"8cce4da2-83af-4f8a-9923-d618bd8a9225\" (UID: \"8cce4da2-83af-4f8a-9923-d618bd8a9225\") "
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.642913 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h" (OuterVolumeSpecName: "kube-api-access-h2x7h") pod "8cce4da2-83af-4f8a-9923-d618bd8a9225" (UID: "8cce4da2-83af-4f8a-9923-d618bd8a9225"). InnerVolumeSpecName "kube-api-access-h2x7h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.645966 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8cce4da2-83af-4f8a-9923-d618bd8a9225" (UID: "8cce4da2-83af-4f8a-9923-d618bd8a9225"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.699188 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config" (OuterVolumeSpecName: "config") pod "8cce4da2-83af-4f8a-9923-d618bd8a9225" (UID: "8cce4da2-83af-4f8a-9923-d618bd8a9225"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.717888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cce4da2-83af-4f8a-9923-d618bd8a9225" (UID: "8cce4da2-83af-4f8a-9923-d618bd8a9225"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.733969 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2x7h\" (UniqueName: \"kubernetes.io/projected/8cce4da2-83af-4f8a-9923-d618bd8a9225-kube-api-access-h2x7h\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.734000 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.734010 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.734018 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.735389 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8cce4da2-83af-4f8a-9923-d618bd8a9225" (UID: "8cce4da2-83af-4f8a-9923-d618bd8a9225"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.839427 4795 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cce4da2-83af-4f8a-9923-d618bd8a9225-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.841041 4795 generic.go:334] "Generic (PLEG): container finished" podID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerID="d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a" exitCode=0
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.842207 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-575df674dd-5xp2t"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.842755 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerDied","Data":"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a"}
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.842792 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-575df674dd-5xp2t" event={"ID":"8cce4da2-83af-4f8a-9923-d618bd8a9225","Type":"ContainerDied","Data":"6b6c595fe74467a83b78631dcaec9938772f82ba49da938ae37e739e51dd0a38"}
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.842808 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6697f55ff5-fj55x"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.842831 4795 scope.go:117] "RemoveContainer" containerID="41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476"
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.844139 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.889757 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-575df674dd-5xp2t"]
Mar 20 17:39:09 crc kubenswrapper[4795]: I0320 17:39:09.899657 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-575df674dd-5xp2t"]
Mar 20 17:39:11 crc kubenswrapper[4795]: I0320 17:39:11.271285 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" path="/var/lib/kubelet/pods/8cce4da2-83af-4f8a-9923-d618bd8a9225/volumes"
Mar 20 17:39:11 crc kubenswrapper[4795]: I0320 17:39:11.337460 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.785662 4795 scope.go:117] "RemoveContainer" containerID="d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a"
Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.845447 4795 scope.go:117] "RemoveContainer" containerID="41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476"
Mar 20 17:39:14 crc kubenswrapper[4795]: E0320 17:39:14.845981 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476\": container with ID starting with 41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476 not found: ID does not exist" containerID="41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476"
Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.846031 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476"} err="failed to get container status \"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476\": rpc error: code = NotFound desc = could not find container \"41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476\": container with ID starting with 41de64cf9e2d2e8a6001fe4e3cbdd23009cc6450f9347e8fcd2d2050661cf476 not found: ID does not exist"
Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.846064 4795 scope.go:117] "RemoveContainer" containerID="d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a"
Mar 20 17:39:14 crc kubenswrapper[4795]: E0320 17:39:14.846391 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a\": container with ID starting with d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a not found: ID does not exist" containerID="d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a"
Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.846433 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a"} err="failed to get container status \"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a\": rpc error: code = NotFound desc = could not find container \"d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a\": container with ID starting with d222b18782e5d372983e796d8f020bcf245c7b820bbda28f9ff00e6c844f0a9a not found: ID does not exist"
Mar 20 17:39:14 crc kubenswrapper[4795]: I0320 17:39:14.909039 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerStarted","Data":"7d09de57ae215285b7a1c023830c93b72b04b973baabd25b25f676f4a51305aa"}
Mar 20 17:39:15 crc kubenswrapper[4795]: I0320 17:39:15.920930 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"cf3f8aea-393e-418a-ad14-2848c8df93e9","Type":"ContainerStarted","Data":"b7ca66e3e1d5493aad87395606a801a678c10a32acb9500132879e8bdf155903"}
Mar 20 17:39:15 crc kubenswrapper[4795]: I0320 17:39:15.924544 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerStarted","Data":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"}
Mar 20 17:39:15 crc kubenswrapper[4795]: I0320 17:39:15.944755 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.470058321 podStartE2EDuration="13.944734328s" podCreationTimestamp="2026-03-20 17:39:02 +0000 UTC" firstStartedPulling="2026-03-20 17:39:03.375664348 +0000 UTC m=+1286.833695889" lastFinishedPulling="2026-03-20 17:39:14.850340355 +0000 UTC m=+1298.308371896" observedRunningTime="2026-03-20 17:39:15.936826108 +0000 UTC m=+1299.394857659" watchObservedRunningTime="2026-03-20 17:39:15.944734328 +0000 UTC m=+1299.402765869"
Mar 20 17:39:16 crc kubenswrapper[4795]: I0320 17:39:16.933746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerStarted","Data":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"}
Mar 20 17:39:16 crc kubenswrapper[4795]: I0320 17:39:16.934798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerStarted","Data":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"}
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.313615 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4l89c"]
Mar 20 17:39:17 crc kubenswrapper[4795]: E0320 17:39:17.314004 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-httpd"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.314021 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-httpd"
Mar 20 17:39:17 crc kubenswrapper[4795]: E0320 17:39:17.314043 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-api"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.314051 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-api"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.314216 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-httpd"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.314240 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cce4da2-83af-4f8a-9923-d618bd8a9225" containerName="neutron-api"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.314853 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4l89c"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.317123 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6697f55ff5-fj55x"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.326572 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4l89c"]
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.347389 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6697f55ff5-fj55x"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.387325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkgh9\" (UniqueName: \"kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.388022 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.420524 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-fqwkd"]
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.421597 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fqwkd"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.431892 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-7051-account-create-update-d2d7p"]
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.433028 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7051-account-create-update-d2d7p"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.434818 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.449835 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7051-account-create-update-d2d7p"]
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.461732 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fqwkd"]
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.489621 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5thvz\" (UniqueName: \"kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " pod="openstack/nova-api-7051-account-create-update-d2d7p"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.489856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5zc8\" (UniqueName: \"kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " pod="openstack/nova-cell0-db-create-fqwkd"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.489941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.490125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgh9\" (UniqueName: \"kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.490209 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " pod="openstack/nova-api-7051-account-create-update-d2d7p"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.490276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " pod="openstack/nova-cell0-db-create-fqwkd"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.490999 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.507429 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkgh9\" (UniqueName: \"kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9\") pod \"nova-api-db-create-4l89c\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " pod="openstack/nova-api-db-create-4l89c"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.591640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " pod="openstack/nova-api-7051-account-create-update-d2d7p"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.591701 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " pod="openstack/nova-cell0-db-create-fqwkd"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.591747 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5thvz\" (UniqueName: \"kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " pod="openstack/nova-api-7051-account-create-update-d2d7p"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.591771 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zc8\" (UniqueName: \"kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " pod="openstack/nova-cell0-db-create-fqwkd"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.592368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " pod="openstack/nova-api-7051-account-create-update-d2d7p"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.592699 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " pod="openstack/nova-cell0-db-create-fqwkd"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.607031 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wd5n7"]
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.614486 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wd5n7"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.616228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5zc8\" (UniqueName: \"kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8\") pod \"nova-cell0-db-create-fqwkd\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " pod="openstack/nova-cell0-db-create-fqwkd"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.649936 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5thvz\" (UniqueName: \"kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz\") pod \"nova-api-7051-account-create-update-d2d7p\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " pod="openstack/nova-api-7051-account-create-update-d2d7p"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.655760 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wd5n7"]
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.661251 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4l89c"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.697032 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.697145 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kntrg\" (UniqueName: \"kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.720011 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-100d-account-create-update-7l925"]
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.722057 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-100d-account-create-update-7l925"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.731087 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.731296 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-100d-account-create-update-7l925"]
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.744565 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fqwkd"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.753411 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7051-account-create-update-d2d7p"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.800636 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.800940 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.800969 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kntrg\" (UniqueName: \"kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.801022 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z26gr\" (UniqueName: \"kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925"
Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.801455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName:
\"kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.838673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kntrg\" (UniqueName: \"kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg\") pod \"nova-cell1-db-create-wd5n7\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.851730 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3748-account-create-update-j2khv"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.852827 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.858877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.869575 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3748-account-create-update-j2khv"] Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.902342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.902538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.902642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z26gr\" (UniqueName: \"kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.902729 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfrfg\" (UniqueName: \"kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.903328 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:17 crc kubenswrapper[4795]: I0320 17:39:17.942447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z26gr\" (UniqueName: \"kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr\") pod \"nova-cell0-100d-account-create-update-7l925\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:18 crc 
kubenswrapper[4795]: I0320 17:39:18.006386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfrfg\" (UniqueName: \"kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.007365 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.037246 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.043258 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.049167 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.072135 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfrfg\" (UniqueName: \"kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg\") pod \"nova-cell1-3748-account-create-update-j2khv\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.210311 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4l89c"] Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.293341 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.357262 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7051-account-create-update-d2d7p"] Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.374995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fqwkd"] Mar 20 17:39:18 crc kubenswrapper[4795]: W0320 17:39:18.385436 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d1dfe60_98b0_4644_b063_831293f9bd5c.slice/crio-23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e WatchSource:0}: Error finding container 23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e: Status 404 returned error can't find the container with id 23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.595864 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3748-account-create-update-j2khv"] Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.612706 4795 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wd5n7"] Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.689157 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-100d-account-create-update-7l925"] Mar 20 17:39:18 crc kubenswrapper[4795]: W0320 17:39:18.707613 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65a51797_b6d0_4b5b_9927_54d4b965469e.slice/crio-4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248 WatchSource:0}: Error finding container 4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248: Status 404 returned error can't find the container with id 4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248 Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.778494 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-698b6ff5c8-7p5rs" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.778720 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-698b6ff5c8-7p5rs" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.954672 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d1dfe60-98b0-4644-b063-831293f9bd5c" containerID="f4b5007e4a1309d08572b5c31e2719d5a9d1e8abc1f29797304920c21729de14" exitCode=0 Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.954727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fqwkd" event={"ID":"7d1dfe60-98b0-4644-b063-831293f9bd5c","Type":"ContainerDied","Data":"f4b5007e4a1309d08572b5c31e2719d5a9d1e8abc1f29797304920c21729de14"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 
17:39:18.955104 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fqwkd" event={"ID":"7d1dfe60-98b0-4644-b063-831293f9bd5c","Type":"ContainerStarted","Data":"23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.956064 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wd5n7" event={"ID":"7bd6f80f-7908-42b5-b32a-63d585bd9194","Type":"ContainerStarted","Data":"970949fc704fe767a632d55a42c62ddb8f7a120a1f8f4ea713ca42632765cd3e"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.958553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerStarted","Data":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.958756 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.964864 4795 generic.go:334] "Generic (PLEG): container finished" podID="efc90399-0b15-4fc6-b441-d7df6925c8aa" containerID="b63d10a82b890eac4b6bd4726e08b48ae844a0390be4307858d97c75a41d914f" exitCode=0 Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.965065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4l89c" event={"ID":"efc90399-0b15-4fc6-b441-d7df6925c8aa","Type":"ContainerDied","Data":"b63d10a82b890eac4b6bd4726e08b48ae844a0390be4307858d97c75a41d914f"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.965108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4l89c" event={"ID":"efc90399-0b15-4fc6-b441-d7df6925c8aa","Type":"ContainerStarted","Data":"700d362b8c96896ed216b0973f95f5cdb189f63903ab165abd13610fbb69975d"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.970098 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3748-account-create-update-j2khv" event={"ID":"9f5daae9-920d-496a-ad6a-c016cfb82250","Type":"ContainerStarted","Data":"bf45615034cdb988e7a4ea9c726e6ec289a16729b8bfc6970ea397d199a8e5a3"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.974610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-100d-account-create-update-7l925" event={"ID":"65a51797-b6d0-4b5b-9927-54d4b965469e","Type":"ContainerStarted","Data":"4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.982662 4795 generic.go:334] "Generic (PLEG): container finished" podID="e42b654e-f003-45dd-a7c4-07655514643e" containerID="d5e2a993de0ac2a73d513cdc5305eaa4c6be7243356c29c7534462e994b17675" exitCode=0 Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.982715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7051-account-create-update-d2d7p" event={"ID":"e42b654e-f003-45dd-a7c4-07655514643e","Type":"ContainerDied","Data":"d5e2a993de0ac2a73d513cdc5305eaa4c6be7243356c29c7534462e994b17675"} Mar 20 17:39:18 crc kubenswrapper[4795]: I0320 17:39:18.982737 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7051-account-create-update-d2d7p" event={"ID":"e42b654e-f003-45dd-a7c4-07655514643e","Type":"ContainerStarted","Data":"776a14a7a36db1bed300198a0a46287925190b675f42aee4a98e85192fcad1df"} Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.029527 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.246106239 podStartE2EDuration="11.029510868s" podCreationTimestamp="2026-03-20 17:39:08 +0000 UTC" firstStartedPulling="2026-03-20 17:39:14.786274253 +0000 UTC m=+1298.244305834" lastFinishedPulling="2026-03-20 17:39:18.569678932 +0000 UTC m=+1302.027710463" observedRunningTime="2026-03-20 17:39:19.013490794 +0000 UTC 
m=+1302.471522335" watchObservedRunningTime="2026-03-20 17:39:19.029510868 +0000 UTC m=+1302.487542409" Mar 20 17:39:19 crc kubenswrapper[4795]: E0320 17:39:19.602383 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65a51797_b6d0_4b5b_9927_54d4b965469e.slice/crio-conmon-43233e70951461edfae55a0a1e96e29418696077a56a7e8b60e307ca9af5a951.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.992539 4795 generic.go:334] "Generic (PLEG): container finished" podID="65a51797-b6d0-4b5b-9927-54d4b965469e" containerID="43233e70951461edfae55a0a1e96e29418696077a56a7e8b60e307ca9af5a951" exitCode=0 Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.992607 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-100d-account-create-update-7l925" event={"ID":"65a51797-b6d0-4b5b-9927-54d4b965469e","Type":"ContainerDied","Data":"43233e70951461edfae55a0a1e96e29418696077a56a7e8b60e307ca9af5a951"} Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.994427 4795 generic.go:334] "Generic (PLEG): container finished" podID="7bd6f80f-7908-42b5-b32a-63d585bd9194" containerID="21aac3ceb6dfd938908085675b810b1f95e9aaa0d7afc715430454788951ca0a" exitCode=0 Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.994511 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wd5n7" event={"ID":"7bd6f80f-7908-42b5-b32a-63d585bd9194","Type":"ContainerDied","Data":"21aac3ceb6dfd938908085675b810b1f95e9aaa0d7afc715430454788951ca0a"} Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.996635 4795 generic.go:334] "Generic (PLEG): container finished" podID="9f5daae9-920d-496a-ad6a-c016cfb82250" containerID="1a5ed57a211fe9b0c1882f91516bfc8da29711316e391c1c87ed18df2cb6cc36" exitCode=0 Mar 20 17:39:19 crc kubenswrapper[4795]: I0320 17:39:19.996760 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3748-account-create-update-j2khv" event={"ID":"9f5daae9-920d-496a-ad6a-c016cfb82250","Type":"ContainerDied","Data":"1a5ed57a211fe9b0c1882f91516bfc8da29711316e391c1c87ed18df2cb6cc36"} Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.415185 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.461859 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.462155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts\") pod \"efc90399-0b15-4fc6-b441-d7df6925c8aa\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.462199 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkgh9\" (UniqueName: \"kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9\") pod \"efc90399-0b15-4fc6-b441-d7df6925c8aa\" (UID: \"efc90399-0b15-4fc6-b441-d7df6925c8aa\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.465166 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efc90399-0b15-4fc6-b441-d7df6925c8aa" (UID: "efc90399-0b15-4fc6-b441-d7df6925c8aa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.480358 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9" (OuterVolumeSpecName: "kube-api-access-xkgh9") pod "efc90399-0b15-4fc6-b441-d7df6925c8aa" (UID: "efc90399-0b15-4fc6-b441-d7df6925c8aa"). InnerVolumeSpecName "kube-api-access-xkgh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.523279 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7051-account-create-update-d2d7p" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.527506 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.564903 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc90399-0b15-4fc6-b441-d7df6925c8aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.564944 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkgh9\" (UniqueName: \"kubernetes.io/projected/efc90399-0b15-4fc6-b441-d7df6925c8aa-kube-api-access-xkgh9\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.666205 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5thvz\" (UniqueName: \"kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz\") pod \"e42b654e-f003-45dd-a7c4-07655514643e\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.666371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts\") pod \"e42b654e-f003-45dd-a7c4-07655514643e\" (UID: \"e42b654e-f003-45dd-a7c4-07655514643e\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.666413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts\") pod \"7d1dfe60-98b0-4644-b063-831293f9bd5c\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.666427 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5zc8\" (UniqueName: \"kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8\") pod \"7d1dfe60-98b0-4644-b063-831293f9bd5c\" (UID: \"7d1dfe60-98b0-4644-b063-831293f9bd5c\") " Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.667340 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d1dfe60-98b0-4644-b063-831293f9bd5c" (UID: "7d1dfe60-98b0-4644-b063-831293f9bd5c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.667396 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e42b654e-f003-45dd-a7c4-07655514643e" (UID: "e42b654e-f003-45dd-a7c4-07655514643e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.676231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8" (OuterVolumeSpecName: "kube-api-access-s5zc8") pod "7d1dfe60-98b0-4644-b063-831293f9bd5c" (UID: "7d1dfe60-98b0-4644-b063-831293f9bd5c"). InnerVolumeSpecName "kube-api-access-s5zc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.676446 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz" (OuterVolumeSpecName: "kube-api-access-5thvz") pod "e42b654e-f003-45dd-a7c4-07655514643e" (UID: "e42b654e-f003-45dd-a7c4-07655514643e"). InnerVolumeSpecName "kube-api-access-5thvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.768816 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5thvz\" (UniqueName: \"kubernetes.io/projected/e42b654e-f003-45dd-a7c4-07655514643e-kube-api-access-5thvz\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.768850 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e42b654e-f003-45dd-a7c4-07655514643e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.768861 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5zc8\" (UniqueName: \"kubernetes.io/projected/7d1dfe60-98b0-4644-b063-831293f9bd5c-kube-api-access-s5zc8\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4795]: I0320 17:39:20.768869 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7d1dfe60-98b0-4644-b063-831293f9bd5c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.006974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4l89c" event={"ID":"efc90399-0b15-4fc6-b441-d7df6925c8aa","Type":"ContainerDied","Data":"700d362b8c96896ed216b0973f95f5cdb189f63903ab165abd13610fbb69975d"} Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.007011 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="700d362b8c96896ed216b0973f95f5cdb189f63903ab165abd13610fbb69975d" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.007054 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4l89c" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.009717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7051-account-create-update-d2d7p" event={"ID":"e42b654e-f003-45dd-a7c4-07655514643e","Type":"ContainerDied","Data":"776a14a7a36db1bed300198a0a46287925190b675f42aee4a98e85192fcad1df"} Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.009764 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="776a14a7a36db1bed300198a0a46287925190b675f42aee4a98e85192fcad1df" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.009828 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7051-account-create-update-d2d7p" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020094 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fqwkd" event={"ID":"7d1dfe60-98b0-4644-b063-831293f9bd5c","Type":"ContainerDied","Data":"23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e"} Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020194 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23fc3b254bcb14b0eb8e333a8d42731b101333bbd809d7019481d5246f86363e" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020431 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fqwkd" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020632 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-central-agent" containerID="cri-o://47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" gracePeriod=30 Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020788 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="proxy-httpd" containerID="cri-o://b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" gracePeriod=30 Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020853 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="sg-core" containerID="cri-o://6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" gracePeriod=30 Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.020914 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-notification-agent" containerID="cri-o://71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" gracePeriod=30 Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.336920 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.385378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kntrg\" (UniqueName: \"kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg\") pod \"7bd6f80f-7908-42b5-b32a-63d585bd9194\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.385506 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts\") pod \"7bd6f80f-7908-42b5-b32a-63d585bd9194\" (UID: \"7bd6f80f-7908-42b5-b32a-63d585bd9194\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.387427 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bd6f80f-7908-42b5-b32a-63d585bd9194" (UID: "7bd6f80f-7908-42b5-b32a-63d585bd9194"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.398842 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg" (OuterVolumeSpecName: "kube-api-access-kntrg") pod "7bd6f80f-7908-42b5-b32a-63d585bd9194" (UID: "7bd6f80f-7908-42b5-b32a-63d585bd9194"). InnerVolumeSpecName "kube-api-access-kntrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.487272 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kntrg\" (UniqueName: \"kubernetes.io/projected/7bd6f80f-7908-42b5-b32a-63d585bd9194-kube-api-access-kntrg\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.487517 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bd6f80f-7908-42b5-b32a-63d585bd9194-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.646706 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.697460 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfrfg\" (UniqueName: \"kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg\") pod \"9f5daae9-920d-496a-ad6a-c016cfb82250\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.697536 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts\") pod \"9f5daae9-920d-496a-ad6a-c016cfb82250\" (UID: \"9f5daae9-920d-496a-ad6a-c016cfb82250\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.698415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f5daae9-920d-496a-ad6a-c016cfb82250" (UID: "9f5daae9-920d-496a-ad6a-c016cfb82250"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.706858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg" (OuterVolumeSpecName: "kube-api-access-kfrfg") pod "9f5daae9-920d-496a-ad6a-c016cfb82250" (UID: "9f5daae9-920d-496a-ad6a-c016cfb82250"). InnerVolumeSpecName "kube-api-access-kfrfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.750021 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.799751 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts\") pod \"65a51797-b6d0-4b5b-9927-54d4b965469e\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.799810 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z26gr\" (UniqueName: \"kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr\") pod \"65a51797-b6d0-4b5b-9927-54d4b965469e\" (UID: \"65a51797-b6d0-4b5b-9927-54d4b965469e\") " Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.800436 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfrfg\" (UniqueName: \"kubernetes.io/projected/9f5daae9-920d-496a-ad6a-c016cfb82250-kube-api-access-kfrfg\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.800456 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f5daae9-920d-496a-ad6a-c016cfb82250-operator-scripts\") on node \"crc\" DevicePath 
\"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.801072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65a51797-b6d0-4b5b-9927-54d4b965469e" (UID: "65a51797-b6d0-4b5b-9927-54d4b965469e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.808201 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr" (OuterVolumeSpecName: "kube-api-access-z26gr") pod "65a51797-b6d0-4b5b-9927-54d4b965469e" (UID: "65a51797-b6d0-4b5b-9927-54d4b965469e"). InnerVolumeSpecName "kube-api-access-z26gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.901693 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65a51797-b6d0-4b5b-9927-54d4b965469e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.901719 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z26gr\" (UniqueName: \"kubernetes.io/projected/65a51797-b6d0-4b5b-9927-54d4b965469e-kube-api-access-z26gr\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:21 crc kubenswrapper[4795]: I0320 17:39:21.923324 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003163 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003444 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003480 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk8fw\" (UniqueName: \"kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003584 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003680 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd\") pod \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\" (UID: \"2df26ac7-bc78-4c22-9b4f-f3797a09bd53\") " Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003900 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.003991 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.004290 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.004309 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.009327 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts" (OuterVolumeSpecName: "scripts") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.013122 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw" (OuterVolumeSpecName: "kube-api-access-tk8fw") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "kube-api-access-tk8fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.030734 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032213 4795 generic.go:334] "Generic (PLEG): container finished" podID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" exitCode=0 Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032242 4795 generic.go:334] "Generic (PLEG): container finished" podID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" exitCode=2 Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032253 4795 generic.go:334] "Generic (PLEG): container finished" podID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" exitCode=0 Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032260 4795 generic.go:334] "Generic (PLEG): container finished" podID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" exitCode=0 Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032353 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032629 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerDied","Data":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerDied","Data":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerDied","Data":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032724 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerDied","Data":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032735 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2df26ac7-bc78-4c22-9b4f-f3797a09bd53","Type":"ContainerDied","Data":"7d09de57ae215285b7a1c023830c93b72b04b973baabd25b25f676f4a51305aa"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.032750 4795 scope.go:117] "RemoveContainer" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.044297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wd5n7" 
event={"ID":"7bd6f80f-7908-42b5-b32a-63d585bd9194","Type":"ContainerDied","Data":"970949fc704fe767a632d55a42c62ddb8f7a120a1f8f4ea713ca42632765cd3e"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.044339 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="970949fc704fe767a632d55a42c62ddb8f7a120a1f8f4ea713ca42632765cd3e" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.044401 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wd5n7" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.060948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3748-account-create-update-j2khv" event={"ID":"9f5daae9-920d-496a-ad6a-c016cfb82250","Type":"ContainerDied","Data":"bf45615034cdb988e7a4ea9c726e6ec289a16729b8bfc6970ea397d199a8e5a3"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.060992 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf45615034cdb988e7a4ea9c726e6ec289a16729b8bfc6970ea397d199a8e5a3" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.061098 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3748-account-create-update-j2khv" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.071134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-100d-account-create-update-7l925" event={"ID":"65a51797-b6d0-4b5b-9927-54d4b965469e","Type":"ContainerDied","Data":"4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248"} Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.072070 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d7cce29ceb65feb9ef06ec75eb798c325695612a7e7d4fc30b5b34ef505a248" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.072524 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-100d-account-create-update-7l925" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.076077 4795 scope.go:117] "RemoveContainer" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.105752 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.107918 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.107996 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk8fw\" (UniqueName: \"kubernetes.io/projected/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-kube-api-access-tk8fw\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.122437 4795 scope.go:117] "RemoveContainer" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.134914 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.148417 4795 scope.go:117] "RemoveContainer" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.163525 4795 scope.go:117] "RemoveContainer" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.163967 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": container with ID starting with b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a not found: ID does not exist" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164007 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} err="failed to get container status \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": rpc error: code = NotFound desc = could not find container \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": container with ID starting with b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164034 4795 scope.go:117] "RemoveContainer" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.164363 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": container with ID starting with 
6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02 not found: ID does not exist" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164384 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"} err="failed to get container status \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": rpc error: code = NotFound desc = could not find container \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": container with ID starting with 6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164398 4795 scope.go:117] "RemoveContainer" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.164637 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": container with ID starting with 71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db not found: ID does not exist" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164654 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"} err="failed to get container status \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": rpc error: code = NotFound desc = could not find container \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": container with ID starting with 71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db not found: ID does not 
exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.164667 4795 scope.go:117] "RemoveContainer" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165342 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data" (OuterVolumeSpecName: "config-data") pod "2df26ac7-bc78-4c22-9b4f-f3797a09bd53" (UID: "2df26ac7-bc78-4c22-9b4f-f3797a09bd53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.165423 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": container with ID starting with 47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66 not found: ID does not exist" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165441 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"} err="failed to get container status \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": rpc error: code = NotFound desc = could not find container \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": container with ID starting with 47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165455 4795 scope.go:117] "RemoveContainer" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165667 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} err="failed to get container status \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": rpc error: code = NotFound desc = could not find container \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": container with ID starting with b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165711 4795 scope.go:117] "RemoveContainer" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165943 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"} err="failed to get container status \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": rpc error: code = NotFound desc = could not find container \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": container with ID starting with 6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.165970 4795 scope.go:117] "RemoveContainer" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166202 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"} err="failed to get container status \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": rpc error: code = NotFound desc = could not find container \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": container with ID starting with 71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db not found: ID does not 
exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166220 4795 scope.go:117] "RemoveContainer" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166429 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"} err="failed to get container status \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": rpc error: code = NotFound desc = could not find container \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": container with ID starting with 47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166445 4795 scope.go:117] "RemoveContainer" containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166607 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} err="failed to get container status \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": rpc error: code = NotFound desc = could not find container \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": container with ID starting with b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166621 4795 scope.go:117] "RemoveContainer" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166806 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"} err="failed to get container status 
\"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": rpc error: code = NotFound desc = could not find container \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": container with ID starting with 6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166828 4795 scope.go:117] "RemoveContainer" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166968 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"} err="failed to get container status \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": rpc error: code = NotFound desc = could not find container \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": container with ID starting with 71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.166986 4795 scope.go:117] "RemoveContainer" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167167 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"} err="failed to get container status \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": rpc error: code = NotFound desc = could not find container \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": container with ID starting with 47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66 not found: ID does not exist" Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167185 4795 scope.go:117] "RemoveContainer" 
containerID="b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167400 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a"} err="failed to get container status \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": rpc error: code = NotFound desc = could not find container \"b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a\": container with ID starting with b671124976ac4be8dc50004f4aa8064c62e6c4133e559b17ad9854be69053b8a not found: ID does not exist"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167416 4795 scope.go:117] "RemoveContainer" containerID="6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167552 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02"} err="failed to get container status \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": rpc error: code = NotFound desc = could not find container \"6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02\": container with ID starting with 6de538c1ccd6555283b203537fd2e941ae2dc249f24c485631efa9eb79b72b02 not found: ID does not exist"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167568 4795 scope.go:117] "RemoveContainer" containerID="71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167726 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db"} err="failed to get container status \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": rpc error: code = NotFound desc = could not find container \"71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db\": container with ID starting with 71fb5de1963c75bbd4901e19e639f4fe731f18fb12cad907aa5c88d266eae2db not found: ID does not exist"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167743 4795 scope.go:117] "RemoveContainer" containerID="47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.167896 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66"} err="failed to get container status \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": rpc error: code = NotFound desc = could not find container \"47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66\": container with ID starting with 47d8484eac228b43a5f3acb3195617a7f13cca401a3b2da9213779bcb2761f66 not found: ID does not exist"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.209910 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.209940 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df26ac7-bc78-4c22-9b4f-f3797a09bd53-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.399758 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.406753 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.426871 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.427474 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1dfe60-98b0-4644-b063-831293f9bd5c" containerName="mariadb-database-create"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.427558 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1dfe60-98b0-4644-b063-831293f9bd5c" containerName="mariadb-database-create"
Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.427624 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5daae9-920d-496a-ad6a-c016cfb82250" containerName="mariadb-account-create-update"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.427676 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5daae9-920d-496a-ad6a-c016cfb82250" containerName="mariadb-account-create-update"
Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.427763 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="sg-core"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.427816 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="sg-core"
Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.427877 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42b654e-f003-45dd-a7c4-07655514643e" containerName="mariadb-account-create-update"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.427931 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42b654e-f003-45dd-a7c4-07655514643e" containerName="mariadb-account-create-update"
Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.427990 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc90399-0b15-4fc6-b441-d7df6925c8aa" containerName="mariadb-database-create"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428039 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc90399-0b15-4fc6-b441-d7df6925c8aa" containerName="mariadb-database-create"
Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.428089 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-notification-agent"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428136 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-notification-agent"
Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.428184 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="proxy-httpd"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428230 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="proxy-httpd"
Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.428296 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-central-agent"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428348 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-central-agent"
Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.428402 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd6f80f-7908-42b5-b32a-63d585bd9194" containerName="mariadb-database-create"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428451 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd6f80f-7908-42b5-b32a-63d585bd9194" containerName="mariadb-database-create"
Mar 20 17:39:22 crc kubenswrapper[4795]: E0320 17:39:22.428509 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a51797-b6d0-4b5b-9927-54d4b965469e" containerName="mariadb-account-create-update"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428569 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a51797-b6d0-4b5b-9927-54d4b965469e" containerName="mariadb-account-create-update"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428807 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc90399-0b15-4fc6-b441-d7df6925c8aa" containerName="mariadb-database-create"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428878 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42b654e-f003-45dd-a7c4-07655514643e" containerName="mariadb-account-create-update"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428935 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="sg-core"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.428993 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5daae9-920d-496a-ad6a-c016cfb82250" containerName="mariadb-account-create-update"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429058 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="proxy-httpd"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429116 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1dfe60-98b0-4644-b063-831293f9bd5c" containerName="mariadb-database-create"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429179 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd6f80f-7908-42b5-b32a-63d585bd9194" containerName="mariadb-database-create"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429239 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a51797-b6d0-4b5b-9927-54d4b965469e" containerName="mariadb-account-create-update"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429294 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-notification-agent"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.429347 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" containerName="ceilometer-central-agent"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.430765 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.439134 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.439417 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.444417 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516297 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516334 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t27qr\" (UniqueName: \"kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516364 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516388 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516505 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.516652 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.618854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.618948 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.618995 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t27qr\" (UniqueName: \"kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619066 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.619825 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.623486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.624007 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.624303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.631326 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.638337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t27qr\" (UniqueName: \"kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr\") pod \"ceilometer-0\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " pod="openstack/ceilometer-0"
Mar 20 17:39:22 crc kubenswrapper[4795]: I0320 17:39:22.746073 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:39:23 crc kubenswrapper[4795]: I0320 17:39:23.261347 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df26ac7-bc78-4c22-9b4f-f3797a09bd53" path="/var/lib/kubelet/pods/2df26ac7-bc78-4c22-9b4f-f3797a09bd53/volumes"
Mar 20 17:39:23 crc kubenswrapper[4795]: W0320 17:39:23.283512 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd66859dd_adc4_48d7_8fad_7f536004b0bb.slice/crio-93d7fbfb53b69d60c3384528edac5066b0162c828391056677c83f22c22105ab WatchSource:0}: Error finding container 93d7fbfb53b69d60c3384528edac5066b0162c828391056677c83f22c22105ab: Status 404 returned error can't find the container with id 93d7fbfb53b69d60c3384528edac5066b0162c828391056677c83f22c22105ab
Mar 20 17:39:23 crc kubenswrapper[4795]: I0320 17:39:23.285520 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.092622 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerID="da11e766148fb6f38d02c50468b495d9c10ec9fe653ddad3b144b8edd961b2d3" exitCode=137
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.092678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerDied","Data":"da11e766148fb6f38d02c50468b495d9c10ec9fe653ddad3b144b8edd961b2d3"}
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.094857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerStarted","Data":"53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757"}
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.094886 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerStarted","Data":"93d7fbfb53b69d60c3384528edac5066b0162c828391056677c83f22c22105ab"}
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.547826 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668446 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") "
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668517 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") "
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668564 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6ghq\" (UniqueName: \"kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") "
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668649 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") "
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") "
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668771 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") "
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.668963 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs\") pod \"d3e822b2-0b57-4f89-ab29-caeb483457a1\" (UID: \"d3e822b2-0b57-4f89-ab29-caeb483457a1\") "
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.669285 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs" (OuterVolumeSpecName: "logs") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.669635 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3e822b2-0b57-4f89-ab29-caeb483457a1-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.676981 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq" (OuterVolumeSpecName: "kube-api-access-d6ghq") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "kube-api-access-d6ghq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.693302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.697638 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.716124 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts" (OuterVolumeSpecName: "scripts") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.716554 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data" (OuterVolumeSpecName: "config-data") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.723857 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "d3e822b2-0b57-4f89-ab29-caeb483457a1" (UID: "d3e822b2-0b57-4f89-ab29-caeb483457a1"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771446 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771489 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e822b2-0b57-4f89-ab29-caeb483457a1-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771506 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6ghq\" (UniqueName: \"kubernetes.io/projected/d3e822b2-0b57-4f89-ab29-caeb483457a1-kube-api-access-d6ghq\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771522 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771535 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:24 crc kubenswrapper[4795]: I0320 17:39:24.771545 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3e822b2-0b57-4f89-ab29-caeb483457a1-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.103321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-698b6ff5c8-7p5rs" event={"ID":"d3e822b2-0b57-4f89-ab29-caeb483457a1","Type":"ContainerDied","Data":"b4a6e3e35ee28437bb36524dd75862315999254a91ff6cc4192d379a2a0e45e4"}
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.103383 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-698b6ff5c8-7p5rs"
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.103371 4795 scope.go:117] "RemoveContainer" containerID="5be00c0e636ec09ccd42a36c542755b2d984e3e3c6dddd06a91f3eb8b8a7efdb"
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.109977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerStarted","Data":"35aea45ae59f7305316fd0a425bf785500761fd9cebc787d5c218633ef2f618b"}
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.140340 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"]
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.156851 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-698b6ff5c8-7p5rs"]
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.188158 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.188479 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-log" containerID="cri-o://601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4" gracePeriod=30
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.188858 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-httpd" containerID="cri-o://d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad" gracePeriod=30
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.273038 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" path="/var/lib/kubelet/pods/d3e822b2-0b57-4f89-ab29-caeb483457a1/volumes"
Mar 20 17:39:25 crc kubenswrapper[4795]: I0320 17:39:25.273931 4795 scope.go:117] "RemoveContainer" containerID="da11e766148fb6f38d02c50468b495d9c10ec9fe653ddad3b144b8edd961b2d3"
Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.122836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerStarted","Data":"fcafe13e4b4f6f7f50c947513c3f7b933825964f518fc8ea72d3a2aa5ba393dd"}
Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.125349 4795 generic.go:334] "Generic (PLEG): container finished" podID="6067c03d-732b-40d9-b017-0365677c39b7" containerID="601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4" exitCode=143
Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.125534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerDied","Data":"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4"}
Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.132097 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.132353 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-log" containerID="cri-o://fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006" gracePeriod=30
Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.132480 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-httpd" containerID="cri-o://d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619" gracePeriod=30
Mar 20 17:39:26 crc kubenswrapper[4795]: I0320 17:39:26.241709 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.139325 4795 generic.go:334] "Generic (PLEG): container finished" podID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerID="fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006" exitCode=143
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.139448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerDied","Data":"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006"}
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.871492 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jfhz8"]
Mar 20 17:39:27 crc kubenswrapper[4795]: E0320 17:39:27.872136 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon"
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.872153 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon"
Mar 20 17:39:27 crc kubenswrapper[4795]: E0320 17:39:27.872165 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon-log"
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.872171 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon-log"
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.872318 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon"
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.872342 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e822b2-0b57-4f89-ab29-caeb483457a1" containerName="horizon-log"
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.872889 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.875526 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.887223 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jfhz8"]
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.887927 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 20 17:39:27 crc kubenswrapper[4795]: I0320 17:39:27.888272 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-c5nq5"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.043008 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfcrm\" (UniqueName: \"kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.043589 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.043640 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.043771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.145611 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.145843 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfcrm\" (UniqueName: \"kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.145980 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.146025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.149918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.150592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.151041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerStarted","Data":"00544b53a989877fad4d8332859c008594ef58360004ffce2654b3a06e72a36e"}
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152380 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152363 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0"
podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-central-agent" containerID="cri-o://53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757" gracePeriod=30 Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152429 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="proxy-httpd" containerID="cri-o://00544b53a989877fad4d8332859c008594ef58360004ffce2654b3a06e72a36e" gracePeriod=30 Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152458 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-notification-agent" containerID="cri-o://35aea45ae59f7305316fd0a425bf785500761fd9cebc787d5c218633ef2f618b" gracePeriod=30 Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.152423 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="sg-core" containerID="cri-o://fcafe13e4b4f6f7f50c947513c3f7b933825964f518fc8ea72d3a2aa5ba393dd" gracePeriod=30 Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.168397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfcrm\" (UniqueName: \"kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm\") pod \"nova-cell0-conductor-db-sync-jfhz8\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") " pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.188511 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.192987 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.526997587 podStartE2EDuration="6.192967997s" podCreationTimestamp="2026-03-20 17:39:22 +0000 UTC" firstStartedPulling="2026-03-20 17:39:23.285865885 +0000 UTC m=+1306.743897426" lastFinishedPulling="2026-03-20 17:39:26.951836295 +0000 UTC m=+1310.409867836" observedRunningTime="2026-03-20 17:39:28.18734053 +0000 UTC m=+1311.645372071" watchObservedRunningTime="2026-03-20 17:39:28.192967997 +0000 UTC m=+1311.650999528" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.643572 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jfhz8"] Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.860057 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22mcg\" (UniqueName: \"kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866263 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866298 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866348 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866384 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866511 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run\") pod \"6067c03d-732b-40d9-b017-0365677c39b7\" (UID: \"6067c03d-732b-40d9-b017-0365677c39b7\") " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866915 4795 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.866995 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs" (OuterVolumeSpecName: "logs") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.871876 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.880734 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts" (OuterVolumeSpecName: "scripts") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.881052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg" (OuterVolumeSpecName: "kube-api-access-22mcg") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "kube-api-access-22mcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.926507 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.936568 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data" (OuterVolumeSpecName: "config-data") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.955404 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6067c03d-732b-40d9-b017-0365677c39b7" (UID: "6067c03d-732b-40d9-b017-0365677c39b7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968053 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968231 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968289 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22mcg\" (UniqueName: \"kubernetes.io/projected/6067c03d-732b-40d9-b017-0365677c39b7-kube-api-access-22mcg\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968342 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6067c03d-732b-40d9-b017-0365677c39b7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968392 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968467 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968535 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.968596 4795 reconciler_common.go:293] "Volume detached 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6067c03d-732b-40d9-b017-0365677c39b7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:28 crc kubenswrapper[4795]: I0320 17:39:28.987812 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.070433 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.160815 4795 generic.go:334] "Generic (PLEG): container finished" podID="6067c03d-732b-40d9-b017-0365677c39b7" containerID="d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad" exitCode=0 Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.160872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerDied","Data":"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.160897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6067c03d-732b-40d9-b017-0365677c39b7","Type":"ContainerDied","Data":"79719142974a75aa1ceb9ca03ec61b98a42d47f6e27982f5c5a5e0502981ad81"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.160913 4795 scope.go:117] "RemoveContainer" containerID="d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.161015 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.166581 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" event={"ID":"02a8b32b-fab3-401f-b667-592c8840bd97","Type":"ContainerStarted","Data":"3c4ecc47c641ea00ae26b045f58c4b097551f488bd93d478e0c38b26018a528e"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170428 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerID="00544b53a989877fad4d8332859c008594ef58360004ffce2654b3a06e72a36e" exitCode=0 Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170458 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerID="fcafe13e4b4f6f7f50c947513c3f7b933825964f518fc8ea72d3a2aa5ba393dd" exitCode=2 Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170468 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerID="35aea45ae59f7305316fd0a425bf785500761fd9cebc787d5c218633ef2f618b" exitCode=0 Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerDied","Data":"00544b53a989877fad4d8332859c008594ef58360004ffce2654b3a06e72a36e"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerDied","Data":"fcafe13e4b4f6f7f50c947513c3f7b933825964f518fc8ea72d3a2aa5ba393dd"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.170819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerDied","Data":"35aea45ae59f7305316fd0a425bf785500761fd9cebc787d5c218633ef2f618b"} Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.275372 4795 scope.go:117] "RemoveContainer" containerID="601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.289131 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.299023 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.311030 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:39:29 crc kubenswrapper[4795]: E0320 17:39:29.311574 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-log" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.311645 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-log" Mar 20 17:39:29 crc kubenswrapper[4795]: E0320 17:39:29.311729 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-httpd" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.311780 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-httpd" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.312007 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-httpd" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.312069 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6067c03d-732b-40d9-b017-0365677c39b7" containerName="glance-log" Mar 
20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.313091 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.316457 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.325090 4795 scope.go:117] "RemoveContainer" containerID="d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.325235 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:39:29 crc kubenswrapper[4795]: E0320 17:39:29.325563 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad\": container with ID starting with d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad not found: ID does not exist" containerID="d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.325611 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad"} err="failed to get container status \"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad\": rpc error: code = NotFound desc = could not find container \"d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad\": container with ID starting with d9c93e776e41f994d43543284a7433c6a284941fdc8471c5c8d5a4de3eb991ad not found: ID does not exist" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.325639 4795 scope.go:117] "RemoveContainer" containerID="601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4" Mar 20 17:39:29 crc kubenswrapper[4795]: E0320 
17:39:29.325910 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4\": container with ID starting with 601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4 not found: ID does not exist" containerID="601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.325940 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4"} err="failed to get container status \"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4\": rpc error: code = NotFound desc = could not find container \"601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4\": container with ID starting with 601523ec3bbc3539dd8c3ac57f7998950bb4d6d9fce23ad3fa8291eed4670fe4 not found: ID does not exist" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.326121 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.491916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd45l\" (UniqueName: \"kubernetes.io/projected/264c2db4-1919-41ce-aea3-bd777167a9ca-kube-api-access-pd45l\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.491971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-logs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 
17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492157 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492256 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492548 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492578 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " 
pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.492657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.595667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.595861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.595890 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.595918 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0" Mar 20 17:39:29 crc 
kubenswrapper[4795]: I0320 17:39:29.595973 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.596037 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd45l\" (UniqueName: \"kubernetes.io/projected/264c2db4-1919-41ce-aea3-bd777167a9ca-kube-api-access-pd45l\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.596071 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-logs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.596095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.596318 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.596876 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.601456 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.603101 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.603099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-scripts\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.604425 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/264c2db4-1919-41ce-aea3-bd777167a9ca-logs\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.607583 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264c2db4-1919-41ce-aea3-bd777167a9ca-config-data\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.618310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd45l\" (UniqueName: \"kubernetes.io/projected/264c2db4-1919-41ce-aea3-bd777167a9ca-kube-api-access-pd45l\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.622809 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"264c2db4-1919-41ce-aea3-bd777167a9ca\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.710616 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.794175 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8cs8\" (UniqueName: \"kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") "
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900190 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") "
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") "
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900295 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") "
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900332 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") "
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.900380 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") "
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.901064 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") "
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.901092 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts\") pod \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\" (UID: \"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197\") "
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.904183 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.904729 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs" (OuterVolumeSpecName: "logs") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.905176 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8" (OuterVolumeSpecName: "kube-api-access-d8cs8") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "kube-api-access-d8cs8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.907330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.908839 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts" (OuterVolumeSpecName: "scripts") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.965014 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.980878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data" (OuterVolumeSpecName: "config-data") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:29 crc kubenswrapper[4795]: I0320 17:39:29.989873 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" (UID: "9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003182 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003233 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003246 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003255 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003263 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003271 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003279 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.003287 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8cs8\" (UniqueName: \"kubernetes.io/projected/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197-kube-api-access-d8cs8\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.021087 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.105195 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.192072 4795 generic.go:334] "Generic (PLEG): container finished" podID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerID="d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619" exitCode=0
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.192112 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerDied","Data":"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619"}
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.192140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197","Type":"ContainerDied","Data":"763133b846bce72cdf94f169388305eb03a69632768733042144310abb80652c"}
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.192160 4795 scope.go:117] "RemoveContainer" containerID="d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.192286 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.232368 4795 scope.go:117] "RemoveContainer" containerID="fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.253712 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.261424 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.262109 4795 scope.go:117] "RemoveContainer" containerID="d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619"
Mar 20 17:39:30 crc kubenswrapper[4795]: E0320 17:39:30.262476 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619\": container with ID starting with d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619 not found: ID does not exist" containerID="d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.262507 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619"} err="failed to get container status \"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619\": rpc error: code = NotFound desc = could not find container \"d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619\": container with ID starting with d3d72e219f8b1bf982529265fe542d2c1935f03b7fa07f60489fa6b56fb7b619 not found: ID does not exist"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.262526 4795 scope.go:117] "RemoveContainer" containerID="fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006"
Mar 20 17:39:30 crc kubenswrapper[4795]: E0320 17:39:30.262838 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006\": container with ID starting with fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006 not found: ID does not exist" containerID="fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.262859 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006"} err="failed to get container status \"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006\": rpc error: code = NotFound desc = could not find container \"fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006\": container with ID starting with fa2afa674077be39dcbfd942a775bb58286f0fa544525a38561678e9d7756006 not found: ID does not exist"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.278303 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 17:39:30 crc kubenswrapper[4795]: E0320 17:39:30.278722 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-log"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.278738 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-log"
Mar 20 17:39:30 crc kubenswrapper[4795]: E0320 17:39:30.278751 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-httpd"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.278759 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-httpd"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.278904 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-log"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.278929 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" containerName="glance-httpd"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.279757 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.283018 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.283170 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.290351 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.341553 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424222 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424419 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgnm4\" (UniqueName: \"kubernetes.io/projected/81d40eb0-c26d-46e7-b8be-631de2f502b9-kube-api-access-rgnm4\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424458 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.424483 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-logs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526297 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526584 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgnm4\" (UniqueName: \"kubernetes.io/projected/81d40eb0-c26d-46e7-b8be-631de2f502b9-kube-api-access-rgnm4\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526769 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-logs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526929 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.526950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.527097 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.527180 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.527247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.527249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81d40eb0-c26d-46e7-b8be-631de2f502b9-logs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.531302 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.533788 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.534231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.551583 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgnm4\" (UniqueName: \"kubernetes.io/projected/81d40eb0-c26d-46e7-b8be-631de2f502b9-kube-api-access-rgnm4\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.551905 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d40eb0-c26d-46e7-b8be-631de2f502b9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.587347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"81d40eb0-c26d-46e7-b8be-631de2f502b9\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:30 crc kubenswrapper[4795]: I0320 17:39:30.654202 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 17:39:31 crc kubenswrapper[4795]: I0320 17:39:31.201929 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 17:39:31 crc kubenswrapper[4795]: W0320 17:39:31.209878 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81d40eb0_c26d_46e7_b8be_631de2f502b9.slice/crio-f23a74eea88d853616f300f0fb298be2cb779054021a542cd507b565a39de8d5 WatchSource:0}: Error finding container f23a74eea88d853616f300f0fb298be2cb779054021a542cd507b565a39de8d5: Status 404 returned error can't find the container with id f23a74eea88d853616f300f0fb298be2cb779054021a542cd507b565a39de8d5
Mar 20 17:39:31 crc kubenswrapper[4795]: I0320 17:39:31.212349 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"264c2db4-1919-41ce-aea3-bd777167a9ca","Type":"ContainerStarted","Data":"c8d456cb515dde74965a84bbb63a0b4e1a16de133a8a826140f21c0bc2ec153f"}
Mar 20 17:39:31 crc kubenswrapper[4795]: I0320 17:39:31.212383 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"264c2db4-1919-41ce-aea3-bd777167a9ca","Type":"ContainerStarted","Data":"f5de7a904eadf145932872ba174825a1efd9acaa99b0bebe8335c2df8a0df661"}
Mar 20 17:39:31 crc kubenswrapper[4795]: I0320 17:39:31.269253 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6067c03d-732b-40d9-b017-0365677c39b7" path="/var/lib/kubelet/pods/6067c03d-732b-40d9-b017-0365677c39b7/volumes"
Mar 20 17:39:31 crc kubenswrapper[4795]: I0320 17:39:31.270342 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197" path="/var/lib/kubelet/pods/9e24d4d3-23ba-4ab0-a5af-3a6dfc19c197/volumes"
Mar 20 17:39:32 crc kubenswrapper[4795]: I0320 17:39:32.230237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81d40eb0-c26d-46e7-b8be-631de2f502b9","Type":"ContainerStarted","Data":"d6ccf3673a3579c18c8c6ffc094d3d920e67e767ceecf42ab48437e6e88280d7"}
Mar 20 17:39:32 crc kubenswrapper[4795]: I0320 17:39:32.230922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81d40eb0-c26d-46e7-b8be-631de2f502b9","Type":"ContainerStarted","Data":"f23a74eea88d853616f300f0fb298be2cb779054021a542cd507b565a39de8d5"}
Mar 20 17:39:32 crc kubenswrapper[4795]: I0320 17:39:32.232679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"264c2db4-1919-41ce-aea3-bd777167a9ca","Type":"ContainerStarted","Data":"f7051b18aca469ed7b78a3975127f1fc84912f0ba6390c83082798937e7d0697"}
Mar 20 17:39:32 crc kubenswrapper[4795]: I0320 17:39:32.248882 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.2488651060000002 podStartE2EDuration="3.248865106s" podCreationTimestamp="2026-03-20 17:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:32.246637607 +0000 UTC m=+1315.704669148" watchObservedRunningTime="2026-03-20 17:39:32.248865106 +0000 UTC m=+1315.706896647"
Mar 20 17:39:33 crc kubenswrapper[4795]: I0320 17:39:33.247723 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerID="53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757" exitCode=0
Mar 20 17:39:33 crc kubenswrapper[4795]: I0320 17:39:33.247904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerDied","Data":"53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757"}
Mar 20 17:39:33 crc kubenswrapper[4795]: I0320 17:39:33.250598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81d40eb0-c26d-46e7-b8be-631de2f502b9","Type":"ContainerStarted","Data":"d29803125c153802162e5de76862336a48b202311af90c926a1dad202c4a61a5"}
Mar 20 17:39:33 crc kubenswrapper[4795]: I0320 17:39:33.269891 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.269872952 podStartE2EDuration="3.269872952s" podCreationTimestamp="2026-03-20 17:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:33.268791287 +0000 UTC m=+1316.726822868" watchObservedRunningTime="2026-03-20 17:39:33.269872952 +0000 UTC m=+1316.727904513"
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.419666 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.568731 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") "
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.568785 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") "
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.568828 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") "
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.568852 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") "
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.568944 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") "
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.569002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t27qr\" (UniqueName: \"kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") "
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.569071 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") "
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.569370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.569789 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.569996 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.573368 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts" (OuterVolumeSpecName: "scripts") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.574893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr" (OuterVolumeSpecName: "kube-api-access-t27qr") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "kube-api-access-t27qr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.602469 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.645288 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.671410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data" (OuterVolumeSpecName: "config-data") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.671671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") pod \"d66859dd-adc4-48d7-8fad-7f536004b0bb\" (UID: \"d66859dd-adc4-48d7-8fad-7f536004b0bb\") " Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.672271 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t27qr\" (UniqueName: \"kubernetes.io/projected/d66859dd-adc4-48d7-8fad-7f536004b0bb-kube-api-access-t27qr\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.672302 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.672322 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.672341 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.672358 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66859dd-adc4-48d7-8fad-7f536004b0bb-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:37 crc kubenswrapper[4795]: W0320 17:39:37.672452 4795 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d66859dd-adc4-48d7-8fad-7f536004b0bb/volumes/kubernetes.io~secret/config-data Mar 20 17:39:37 crc 
kubenswrapper[4795]: I0320 17:39:37.672468 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data" (OuterVolumeSpecName: "config-data") pod "d66859dd-adc4-48d7-8fad-7f536004b0bb" (UID: "d66859dd-adc4-48d7-8fad-7f536004b0bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:37 crc kubenswrapper[4795]: I0320 17:39:37.773811 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66859dd-adc4-48d7-8fad-7f536004b0bb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.299456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" event={"ID":"02a8b32b-fab3-401f-b667-592c8840bd97","Type":"ContainerStarted","Data":"181a6c95c401e76e8326dba0d2e07f193da50d6dbe0b9151509c36ea5ad10c3e"} Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.304254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d66859dd-adc4-48d7-8fad-7f536004b0bb","Type":"ContainerDied","Data":"93d7fbfb53b69d60c3384528edac5066b0162c828391056677c83f22c22105ab"} Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.304310 4795 scope.go:117] "RemoveContainer" containerID="00544b53a989877fad4d8332859c008594ef58360004ffce2654b3a06e72a36e" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.304314 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.325520 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" podStartSLOduration=2.536129034 podStartE2EDuration="11.325499199s" podCreationTimestamp="2026-03-20 17:39:27 +0000 UTC" firstStartedPulling="2026-03-20 17:39:28.648220389 +0000 UTC m=+1312.106251930" lastFinishedPulling="2026-03-20 17:39:37.437590514 +0000 UTC m=+1320.895622095" observedRunningTime="2026-03-20 17:39:38.318280732 +0000 UTC m=+1321.776312273" watchObservedRunningTime="2026-03-20 17:39:38.325499199 +0000 UTC m=+1321.783530740" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.359984 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.371025 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400121 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:38 crc kubenswrapper[4795]: E0320 17:39:38.400580 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-notification-agent" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400602 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-notification-agent" Mar 20 17:39:38 crc kubenswrapper[4795]: E0320 17:39:38.400619 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-central-agent" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400628 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-central-agent" Mar 20 17:39:38 crc 
kubenswrapper[4795]: E0320 17:39:38.400643 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="proxy-httpd" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400652 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="proxy-httpd" Mar 20 17:39:38 crc kubenswrapper[4795]: E0320 17:39:38.400665 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="sg-core" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400675 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="sg-core" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400937 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-central-agent" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400954 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="sg-core" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400964 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="proxy-httpd" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.400981 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" containerName="ceilometer-notification-agent" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.402727 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.405365 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.405572 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.411916 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.485903 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.486867 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.487063 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.487178 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 
17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.487279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x96j4\" (UniqueName: \"kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.487425 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.487507 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589110 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x96j4\" (UniqueName: \"kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589133 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589199 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589271 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.589293 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.590396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: 
I0320 17:39:38.591063 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.594011 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.595080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.596103 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.597365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data\") pod \"ceilometer-0\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.625241 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x96j4\" (UniqueName: \"kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4\") pod \"ceilometer-0\" (UID: 
\"7e54cb1b-1f35-4344-899a-395d140ac8c3\") " pod="openstack/ceilometer-0" Mar 20 17:39:38 crc kubenswrapper[4795]: I0320 17:39:38.725817 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.143018 4795 scope.go:117] "RemoveContainer" containerID="fcafe13e4b4f6f7f50c947513c3f7b933825964f518fc8ea72d3a2aa5ba393dd" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.173017 4795 scope.go:117] "RemoveContainer" containerID="35aea45ae59f7305316fd0a425bf785500761fd9cebc787d5c218633ef2f618b" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.262785 4795 scope.go:117] "RemoveContainer" containerID="53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.271829 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66859dd-adc4-48d7-8fad-7f536004b0bb" path="/var/lib/kubelet/pods/d66859dd-adc4-48d7-8fad-7f536004b0bb/volumes" Mar 20 17:39:39 crc kubenswrapper[4795]: E0320 17:39:39.327870 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757\": container with ID starting with 53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757 not found: ID does not exist" containerID="53554bccf79715e981642fc80363936035dffdd9625374dabd4bad98fbcaa757" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.649054 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:39 crc kubenswrapper[4795]: W0320 17:39:39.664992 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e54cb1b_1f35_4344_899a_395d140ac8c3.slice/crio-bd89bf849cc06624e535b1c043affece374c22e807d01d0820b5405eacdd2f3c WatchSource:0}: Error finding container 
bd89bf849cc06624e535b1c043affece374c22e807d01d0820b5405eacdd2f3c: Status 404 returned error can't find the container with id bd89bf849cc06624e535b1c043affece374c22e807d01d0820b5405eacdd2f3c Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.711734 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.711781 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.744356 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:39:39 crc kubenswrapper[4795]: I0320 17:39:39.758218 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.341734 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerStarted","Data":"bd89bf849cc06624e535b1c043affece374c22e807d01d0820b5405eacdd2f3c"} Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.342130 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.342151 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.654838 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.654887 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 
17:39:40.714640 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:40 crc kubenswrapper[4795]: I0320 17:39:40.726538 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:41 crc kubenswrapper[4795]: I0320 17:39:41.353612 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerStarted","Data":"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c"} Mar 20 17:39:41 crc kubenswrapper[4795]: I0320 17:39:41.353820 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerStarted","Data":"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98"} Mar 20 17:39:41 crc kubenswrapper[4795]: I0320 17:39:41.356045 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:41 crc kubenswrapper[4795]: I0320 17:39:41.356086 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:42 crc kubenswrapper[4795]: I0320 17:39:42.180786 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:39:42 crc kubenswrapper[4795]: I0320 17:39:42.369254 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:39:42 crc kubenswrapper[4795]: I0320 17:39:42.369959 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerStarted","Data":"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2"} Mar 20 17:39:42 crc kubenswrapper[4795]: I0320 17:39:42.390104 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:39:43 crc kubenswrapper[4795]: I0320 17:39:43.264391 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:43 crc kubenswrapper[4795]: I0320 17:39:43.400481 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:39:43 crc kubenswrapper[4795]: I0320 17:39:43.408578 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:39:45 crc kubenswrapper[4795]: I0320 17:39:45.435210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerStarted","Data":"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c"} Mar 20 17:39:45 crc kubenswrapper[4795]: I0320 17:39:45.435743 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:39:45 crc kubenswrapper[4795]: I0320 17:39:45.470027 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.013865799 podStartE2EDuration="7.47001067s" podCreationTimestamp="2026-03-20 17:39:38 +0000 UTC" firstStartedPulling="2026-03-20 17:39:39.668574982 +0000 UTC m=+1323.126606523" lastFinishedPulling="2026-03-20 17:39:45.124719853 +0000 UTC m=+1328.582751394" observedRunningTime="2026-03-20 17:39:45.464072973 +0000 UTC m=+1328.922104514" watchObservedRunningTime="2026-03-20 17:39:45.47001067 +0000 UTC m=+1328.928042211" Mar 20 17:39:47 crc kubenswrapper[4795]: I0320 17:39:47.269124 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:39:48 crc kubenswrapper[4795]: I0320 17:39:48.465271 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-central-agent" containerID="cri-o://e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98" gracePeriod=30 Mar 20 17:39:48 crc kubenswrapper[4795]: I0320 17:39:48.465332 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="proxy-httpd" containerID="cri-o://2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c" gracePeriod=30 Mar 20 17:39:48 crc kubenswrapper[4795]: I0320 17:39:48.465429 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="sg-core" containerID="cri-o://fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2" gracePeriod=30 Mar 20 17:39:48 crc kubenswrapper[4795]: I0320 17:39:48.465353 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-notification-agent" containerID="cri-o://1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c" gracePeriod=30 Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.474700 4795 generic.go:334] "Generic (PLEG): container finished" podID="02a8b32b-fab3-401f-b667-592c8840bd97" containerID="181a6c95c401e76e8326dba0d2e07f193da50d6dbe0b9151509c36ea5ad10c3e" exitCode=0 Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.475019 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" event={"ID":"02a8b32b-fab3-401f-b667-592c8840bd97","Type":"ContainerDied","Data":"181a6c95c401e76e8326dba0d2e07f193da50d6dbe0b9151509c36ea5ad10c3e"} Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479449 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e54cb1b-1f35-4344-899a-395d140ac8c3" 
containerID="2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c" exitCode=0
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479470 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerID="fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2" exitCode=2
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479477 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerID="1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c" exitCode=0
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479492 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerDied","Data":"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c"}
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479508 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerDied","Data":"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2"}
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.479517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerDied","Data":"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c"}
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.856735 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.887908 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") "
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.887979 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") "
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888012 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") "
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888032 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") "
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888121 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x96j4\" (UniqueName: \"kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") "
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") "
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888192 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd\") pod \"7e54cb1b-1f35-4344-899a-395d140ac8c3\" (UID: \"7e54cb1b-1f35-4344-899a-395d140ac8c3\") "
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888441 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888852 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.888862 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.894199 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts" (OuterVolumeSpecName: "scripts") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.894261 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4" (OuterVolumeSpecName: "kube-api-access-x96j4") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "kube-api-access-x96j4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.932061 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.980960 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data" (OuterVolumeSpecName: "config-data") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.990754 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.990788 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.990802 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.990816 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x96j4\" (UniqueName: \"kubernetes.io/projected/7e54cb1b-1f35-4344-899a-395d140ac8c3-kube-api-access-x96j4\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.990828 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e54cb1b-1f35-4344-899a-395d140ac8c3-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:49 crc kubenswrapper[4795]: I0320 17:39:49.994444 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e54cb1b-1f35-4344-899a-395d140ac8c3" (UID: "7e54cb1b-1f35-4344-899a-395d140ac8c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.092067 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e54cb1b-1f35-4344-899a-395d140ac8c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.491939 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerID="e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98" exitCode=0
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.492014 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerDied","Data":"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98"}
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.492037 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.492086 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e54cb1b-1f35-4344-899a-395d140ac8c3","Type":"ContainerDied","Data":"bd89bf849cc06624e535b1c043affece374c22e807d01d0820b5405eacdd2f3c"}
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.492117 4795 scope.go:117] "RemoveContainer" containerID="2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.542290 4795 scope.go:117] "RemoveContainer" containerID="fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.548910 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.558276 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.578870 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.579278 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-notification-agent"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579302 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-notification-agent"
Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.579328 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="proxy-httpd"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579337 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="proxy-httpd"
Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.579365 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-central-agent"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579374 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-central-agent"
Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.579387 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="sg-core"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579395 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="sg-core"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579597 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="proxy-httpd"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579619 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-central-agent"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579631 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="ceilometer-notification-agent"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.579655 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" containerName="sg-core"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.581580 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.587132 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.613931 4795 scope.go:117] "RemoveContainer" containerID="1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.613972 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.614298 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.650417 4795 scope.go:117] "RemoveContainer" containerID="e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.688859 4795 scope.go:117] "RemoveContainer" containerID="2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c"
Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.691100 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c\": container with ID starting with 2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c not found: ID does not exist" containerID="2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.691149 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c"} err="failed to get container status \"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c\": rpc error: code = NotFound desc = could not find container \"2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c\": container with ID starting with 2393691e45183bc4f3456c0337568663a94aac5d97d8e584903e079e7ea2292c not found: ID does not exist"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.691174 4795 scope.go:117] "RemoveContainer" containerID="fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2"
Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.691531 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2\": container with ID starting with fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2 not found: ID does not exist" containerID="fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.691582 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2"} err="failed to get container status \"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2\": rpc error: code = NotFound desc = could not find container \"fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2\": container with ID starting with fbb65d42ca88be23f0ec0100bb1ff65d4be43f53ebacd656709abbdc363c90b2 not found: ID does not exist"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.691602 4795 scope.go:117] "RemoveContainer" containerID="1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c"
Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.691994 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c\": container with ID starting with 1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c not found: ID does not exist" containerID="1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.692012 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c"} err="failed to get container status \"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c\": rpc error: code = NotFound desc = could not find container \"1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c\": container with ID starting with 1d6e88b5d356f8f4b09fef778769b5bb90dc80da3dc4f6023f6a6abf788a686c not found: ID does not exist"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.692026 4795 scope.go:117] "RemoveContainer" containerID="e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98"
Mar 20 17:39:50 crc kubenswrapper[4795]: E0320 17:39:50.692245 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98\": container with ID starting with e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98 not found: ID does not exist" containerID="e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.692259 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98"} err="failed to get container status \"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98\": rpc error: code = NotFound desc = could not find container \"e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98\": container with ID starting with e2a57b52a7f485bc92076c3d1df2f504a6ee341efae7284fbe71550928071e98 not found: ID does not exist"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715077 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715179 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715207 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bll6z\" (UniqueName: \"kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715252 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.715373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817237 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817259 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817391 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.817410 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll6z\" (UniqueName: \"kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.818885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.818967 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.822301 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.822402 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.823213 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.823559 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.833790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bll6z\" (UniqueName: \"kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z\") pod \"ceilometer-0\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " pod="openstack/ceilometer-0"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.897575 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:50 crc kubenswrapper[4795]: I0320 17:39:50.949792 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.020013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data\") pod \"02a8b32b-fab3-401f-b667-592c8840bd97\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") "
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.020464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfcrm\" (UniqueName: \"kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm\") pod \"02a8b32b-fab3-401f-b667-592c8840bd97\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") "
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.021205 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts\") pod \"02a8b32b-fab3-401f-b667-592c8840bd97\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") "
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.021355 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle\") pod \"02a8b32b-fab3-401f-b667-592c8840bd97\" (UID: \"02a8b32b-fab3-401f-b667-592c8840bd97\") "
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.025814 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts" (OuterVolumeSpecName: "scripts") pod "02a8b32b-fab3-401f-b667-592c8840bd97" (UID: "02a8b32b-fab3-401f-b667-592c8840bd97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.026060 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm" (OuterVolumeSpecName: "kube-api-access-vfcrm") pod "02a8b32b-fab3-401f-b667-592c8840bd97" (UID: "02a8b32b-fab3-401f-b667-592c8840bd97"). InnerVolumeSpecName "kube-api-access-vfcrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.055081 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data" (OuterVolumeSpecName: "config-data") pod "02a8b32b-fab3-401f-b667-592c8840bd97" (UID: "02a8b32b-fab3-401f-b667-592c8840bd97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.064769 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02a8b32b-fab3-401f-b667-592c8840bd97" (UID: "02a8b32b-fab3-401f-b667-592c8840bd97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.124891 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfcrm\" (UniqueName: \"kubernetes.io/projected/02a8b32b-fab3-401f-b667-592c8840bd97-kube-api-access-vfcrm\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.124926 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.124935 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.124944 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02a8b32b-fab3-401f-b667-592c8840bd97-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.263250 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e54cb1b-1f35-4344-899a-395d140ac8c3" path="/var/lib/kubelet/pods/7e54cb1b-1f35-4344-899a-395d140ac8c3/volumes"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.391918 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.503487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jfhz8" event={"ID":"02a8b32b-fab3-401f-b667-592c8840bd97","Type":"ContainerDied","Data":"3c4ecc47c641ea00ae26b045f58c4b097551f488bd93d478e0c38b26018a528e"}
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.503545 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c4ecc47c641ea00ae26b045f58c4b097551f488bd93d478e0c38b26018a528e"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.503541 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jfhz8"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.506756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerStarted","Data":"39c11fccd9e673059022bf047af401ca209830155fcf251e8d72aeeb8fa6e0d2"}
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.593893 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 17:39:51 crc kubenswrapper[4795]: E0320 17:39:51.594494 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a8b32b-fab3-401f-b667-592c8840bd97" containerName="nova-cell0-conductor-db-sync"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.594516 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a8b32b-fab3-401f-b667-592c8840bd97" containerName="nova-cell0-conductor-db-sync"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.594766 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a8b32b-fab3-401f-b667-592c8840bd97" containerName="nova-cell0-conductor-db-sync"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.595503 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.597055 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.597653 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-c5nq5"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.620413 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.736180 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.736281 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9qgc\" (UniqueName: \"kubernetes.io/projected/5916e4d2-2863-4088-be97-cf368906820b-kube-api-access-r9qgc\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.736307 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.837525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9qgc\" (UniqueName: \"kubernetes.io/projected/5916e4d2-2863-4088-be97-cf368906820b-kube-api-access-r9qgc\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.837564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.837659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.842719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.851744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5916e4d2-2863-4088-be97-cf368906820b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.863663 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9qgc\" (UniqueName: \"kubernetes.io/projected/5916e4d2-2863-4088-be97-cf368906820b-kube-api-access-r9qgc\") pod \"nova-cell0-conductor-0\" (UID: \"5916e4d2-2863-4088-be97-cf368906820b\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:51 crc kubenswrapper[4795]: I0320 17:39:51.918717 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:52 crc kubenswrapper[4795]: I0320 17:39:52.164994 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 17:39:52 crc kubenswrapper[4795]: W0320 17:39:52.171105 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5916e4d2_2863_4088_be97_cf368906820b.slice/crio-2d8c939f22643424586783c3f78925e3a69fe31a7bdbcc29810c323e6661b500 WatchSource:0}: Error finding container 2d8c939f22643424586783c3f78925e3a69fe31a7bdbcc29810c323e6661b500: Status 404 returned error can't find the container with id 2d8c939f22643424586783c3f78925e3a69fe31a7bdbcc29810c323e6661b500
Mar 20 17:39:52 crc kubenswrapper[4795]: I0320 17:39:52.518863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5916e4d2-2863-4088-be97-cf368906820b","Type":"ContainerStarted","Data":"b21e2cbeacbd4128217704e7ce8b39085fc6ffcb9e96a06db0805f198d443a18"}
Mar 20 17:39:52 crc kubenswrapper[4795]: I0320 17:39:52.519277 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5916e4d2-2863-4088-be97-cf368906820b","Type":"ContainerStarted","Data":"2d8c939f22643424586783c3f78925e3a69fe31a7bdbcc29810c323e6661b500"}
Mar 20 17:39:52 crc kubenswrapper[4795]: I0320 17:39:52.519306 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 20 17:39:52 crc kubenswrapper[4795]: I0320 17:39:52.561788 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.561767744 podStartE2EDuration="1.561767744s"
podCreationTimestamp="2026-03-20 17:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:52.536761308 +0000 UTC m=+1335.994792889" watchObservedRunningTime="2026-03-20 17:39:52.561767744 +0000 UTC m=+1336.019799305" Mar 20 17:39:53 crc kubenswrapper[4795]: I0320 17:39:53.547219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerStarted","Data":"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1"} Mar 20 17:39:55 crc kubenswrapper[4795]: I0320 17:39:55.580498 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerStarted","Data":"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc"} Mar 20 17:39:55 crc kubenswrapper[4795]: I0320 17:39:55.581299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerStarted","Data":"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c"} Mar 20 17:39:58 crc kubenswrapper[4795]: I0320 17:39:58.613534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerStarted","Data":"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f"} Mar 20 17:39:58 crc kubenswrapper[4795]: I0320 17:39:58.614382 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:39:58 crc kubenswrapper[4795]: I0320 17:39:58.668825 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.481911165 podStartE2EDuration="8.668801292s" podCreationTimestamp="2026-03-20 17:39:50 +0000 UTC" firstStartedPulling="2026-03-20 
17:39:51.399063296 +0000 UTC m=+1334.857094847" lastFinishedPulling="2026-03-20 17:39:57.585953393 +0000 UTC m=+1341.043984974" observedRunningTime="2026-03-20 17:39:58.647077739 +0000 UTC m=+1342.105109290" watchObservedRunningTime="2026-03-20 17:39:58.668801292 +0000 UTC m=+1342.126832833" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.177019 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567140-s5wtb"] Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.178867 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.181200 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.181463 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.181520 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.187808 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-s5wtb"] Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.353636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52dck\" (UniqueName: \"kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck\") pod \"auto-csr-approver-29567140-s5wtb\" (UID: \"f9e6fe9e-d22e-420c-b050-a00a53749f1f\") " pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.456101 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52dck\" (UniqueName: 
\"kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck\") pod \"auto-csr-approver-29567140-s5wtb\" (UID: \"f9e6fe9e-d22e-420c-b050-a00a53749f1f\") " pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.484566 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52dck\" (UniqueName: \"kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck\") pod \"auto-csr-approver-29567140-s5wtb\" (UID: \"f9e6fe9e-d22e-420c-b050-a00a53749f1f\") " pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.503555 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:00 crc kubenswrapper[4795]: I0320 17:40:00.980955 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-s5wtb"] Mar 20 17:40:00 crc kubenswrapper[4795]: W0320 17:40:00.992939 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e6fe9e_d22e_420c_b050_a00a53749f1f.slice/crio-69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f WatchSource:0}: Error finding container 69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f: Status 404 returned error can't find the container with id 69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f Mar 20 17:40:01 crc kubenswrapper[4795]: I0320 17:40:01.644208 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" event={"ID":"f9e6fe9e-d22e-420c-b050-a00a53749f1f","Type":"ContainerStarted","Data":"69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f"} Mar 20 17:40:01 crc kubenswrapper[4795]: I0320 17:40:01.946500 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.449470 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-kmgk9"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.451044 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.453278 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.453584 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.461202 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kmgk9"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.608015 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.608412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.608471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh9qq\" (UniqueName: 
\"kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.608515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.709763 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.709866 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.709927 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh9qq\" (UniqueName: \"kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.709964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.718990 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.723362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.723622 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.764264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh9qq\" (UniqueName: \"kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq\") pod \"nova-cell0-cell-mapping-kmgk9\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") " pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.769569 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.771041 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.774128 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.786141 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.787197 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kmgk9" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.859053 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.860214 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.863192 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.867039 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.880558 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.883214 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.886068 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.914173 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.914454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.914601 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.914624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9vf2\" (UniqueName: \"kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:02 crc kubenswrapper[4795]: I0320 17:40:02.944149 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.010263 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.012003 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016639 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbkkt\" (UniqueName: \"kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt\") pod \"nova-scheduler-0\" (UID: 
\"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016742 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbfmr\" (UniqueName: \"kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016768 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9vf2\" (UniqueName: \"kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.016872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 
17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.020506 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.020645 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.024595 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.024651 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.042148 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.058007 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9vf2\" (UniqueName: \"kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2\") pod \"nova-api-0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") " pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.082547 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.084476 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.100031 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118335 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjvw6\" (UniqueName: \"kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118462 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118508 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118524 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbkkt\" (UniqueName: \"kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118567 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbfmr\" (UniqueName: \"kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118615 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.118634 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data\") pod \"nova-metadata-0\" (UID: 
\"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.123074 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.123125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.128101 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.133092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.146663 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbkkt\" (UniqueName: \"kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt\") pod \"nova-scheduler-0\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.147141 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xbfmr\" (UniqueName: \"kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr\") pod \"nova-cell1-novncproxy-0\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.153591 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.197501 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220183 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220289 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9h7l\" (UniqueName: \"kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220371 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjvw6\" (UniqueName: \"kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.220639 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.222240 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.227551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.230923 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.241804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjvw6\" (UniqueName: \"kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6\") pod \"nova-metadata-0\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") " 
pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322259 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9h7l\" (UniqueName: \"kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 
17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.322495 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.323449 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.324047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.324125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.324532 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.325037 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.345642 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9h7l\" (UniqueName: \"kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l\") pod \"dnsmasq-dns-757b4f8459-gnz2j\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") " pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.357339 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.357451 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.403447 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.561081 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kmgk9"] Mar 20 17:40:03 crc kubenswrapper[4795]: W0320 17:40:03.603746 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18794d5c_e43a_44dc_9510_763a31275104.slice/crio-221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf WatchSource:0}: Error finding container 221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf: Status 404 returned error can't find the container with id 221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.663513 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kmgk9" event={"ID":"18794d5c-e43a-44dc-9510-763a31275104","Type":"ContainerStarted","Data":"221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf"} Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.665088 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" event={"ID":"f9e6fe9e-d22e-420c-b050-a00a53749f1f","Type":"ContainerDied","Data":"8ac90f4263985d5a19d6f00ac01d70eb81681c5a298c4e2c5302052e573286a6"} Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.665157 4795 generic.go:334] "Generic (PLEG): container finished" podID="f9e6fe9e-d22e-420c-b050-a00a53749f1f" containerID="8ac90f4263985d5a19d6f00ac01d70eb81681c5a298c4e2c5302052e573286a6" exitCode=0 Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.714533 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.856816 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:03 crc 
kubenswrapper[4795]: I0320 17:40:03.879017 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcxtg"] Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.880224 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.882400 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.883885 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 17:40:03 crc kubenswrapper[4795]: I0320 17:40:03.902012 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcxtg"] Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.006182 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:04 crc kubenswrapper[4795]: W0320 17:40:04.019323 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d499f64_fbe0_4f89_af22_619a306e7857.slice/crio-033d346b30a1db4f7e9a01124daad98761199eedb3969eb07af8bdf4a1a9d7f0 WatchSource:0}: Error finding container 033d346b30a1db4f7e9a01124daad98761199eedb3969eb07af8bdf4a1a9d7f0: Status 404 returned error can't find the container with id 033d346b30a1db4f7e9a01124daad98761199eedb3969eb07af8bdf4a1a9d7f0 Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.026180 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"] Mar 20 17:40:04 crc kubenswrapper[4795]: W0320 17:40:04.026715 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74d437e5_b643_4a6f_a9d9_50cf8166d0af.slice/crio-a6f7fa5a8cd4bb19f7f275862231fd5657f29dc6b3da296fa035e77f5df8d7d5 WatchSource:0}: Error finding container a6f7fa5a8cd4bb19f7f275862231fd5657f29dc6b3da296fa035e77f5df8d7d5: Status 404 returned error can't find the container with id a6f7fa5a8cd4bb19f7f275862231fd5657f29dc6b3da296fa035e77f5df8d7d5 Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.036061 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.038896 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.038982 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.039010 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.039081 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvf65\" 
(UniqueName: \"kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.140598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvf65\" (UniqueName: \"kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.140745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.140784 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.140802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.144829 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.145277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.148090 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.161164 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvf65\" (UniqueName: \"kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65\") pod \"nova-cell1-conductor-db-sync-bcxtg\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.200924 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.667489 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcxtg"] Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.720252 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerStarted","Data":"ab84cd82bd3f42e54f99be97caf2de2ed929e55f48d616683e9c8f126716e9cd"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.729649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2f35863-4f45-43d5-b600-9028b32195d7","Type":"ContainerStarted","Data":"a81953c7eff097e8b8de2cddd252282ad6966a7afa286fafb4eb334123de90a3"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.732237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerStarted","Data":"c016e3406d811f115716e682934de46c5a980e71042e6b56bc4a4a96322456f5"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.739463 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kmgk9" event={"ID":"18794d5c-e43a-44dc-9510-763a31275104","Type":"ContainerStarted","Data":"2bb981684096c6d7989fb2cb73e5f71d3f241740ae3f49c070189810ce7e7bb1"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.740970 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74d437e5-b643-4a6f-a9d9-50cf8166d0af","Type":"ContainerStarted","Data":"a6f7fa5a8cd4bb19f7f275862231fd5657f29dc6b3da296fa035e77f5df8d7d5"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.745929 4795 generic.go:334] "Generic (PLEG): container finished" podID="3d499f64-fbe0-4f89-af22-619a306e7857" 
containerID="bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953" exitCode=0 Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.746518 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" event={"ID":"3d499f64-fbe0-4f89-af22-619a306e7857","Type":"ContainerDied","Data":"bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.746543 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" event={"ID":"3d499f64-fbe0-4f89-af22-619a306e7857","Type":"ContainerStarted","Data":"033d346b30a1db4f7e9a01124daad98761199eedb3969eb07af8bdf4a1a9d7f0"} Mar 20 17:40:04 crc kubenswrapper[4795]: I0320 17:40:04.769735 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-kmgk9" podStartSLOduration=2.769718817 podStartE2EDuration="2.769718817s" podCreationTimestamp="2026-03-20 17:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:04.760194968 +0000 UTC m=+1348.218226509" watchObservedRunningTime="2026-03-20 17:40:04.769718817 +0000 UTC m=+1348.227750358" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.139416 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.267824 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52dck\" (UniqueName: \"kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck\") pod \"f9e6fe9e-d22e-420c-b050-a00a53749f1f\" (UID: \"f9e6fe9e-d22e-420c-b050-a00a53749f1f\") " Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.272742 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck" (OuterVolumeSpecName: "kube-api-access-52dck") pod "f9e6fe9e-d22e-420c-b050-a00a53749f1f" (UID: "f9e6fe9e-d22e-420c-b050-a00a53749f1f"). InnerVolumeSpecName "kube-api-access-52dck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.370439 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52dck\" (UniqueName: \"kubernetes.io/projected/f9e6fe9e-d22e-420c-b050-a00a53749f1f-kube-api-access-52dck\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.756096 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" event={"ID":"4f61db3a-a7de-495d-8305-b9e2910415e2","Type":"ContainerStarted","Data":"de6cb1e775438df313c59586663c35fb681b66b389159fc9df68bc69d850ac1c"} Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.756217 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" event={"ID":"4f61db3a-a7de-495d-8305-b9e2910415e2","Type":"ContainerStarted","Data":"be403fd93662530286fc5651363ea195fe87d753696088488bd47663f1933769"} Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.760045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567140-s5wtb" event={"ID":"f9e6fe9e-d22e-420c-b050-a00a53749f1f","Type":"ContainerDied","Data":"69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f"} Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.760076 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69fdf23b7c35c2ff8e00c0c6f8ae66a1a9ce3886878ec77522b3e11fdb94462f" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.760120 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-s5wtb" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.763517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" event={"ID":"3d499f64-fbe0-4f89-af22-619a306e7857","Type":"ContainerStarted","Data":"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25"} Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.763583 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.775369 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" podStartSLOduration=2.7753529500000003 podStartE2EDuration="2.77535295s" podCreationTimestamp="2026-03-20 17:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:05.7728008 +0000 UTC m=+1349.230832351" watchObservedRunningTime="2026-03-20 17:40:05.77535295 +0000 UTC m=+1349.233384491" Mar 20 17:40:05 crc kubenswrapper[4795]: I0320 17:40:05.799733 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" podStartSLOduration=2.799714836 podStartE2EDuration="2.799714836s" podCreationTimestamp="2026-03-20 17:40:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:05.797149815 +0000 UTC m=+1349.255181356" watchObservedRunningTime="2026-03-20 17:40:05.799714836 +0000 UTC m=+1349.257746377" Mar 20 17:40:06 crc kubenswrapper[4795]: I0320 17:40:06.207299 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-glqvv"] Mar 20 17:40:06 crc kubenswrapper[4795]: I0320 17:40:06.214171 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-glqvv"] Mar 20 17:40:06 crc kubenswrapper[4795]: I0320 17:40:06.475536 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:06 crc kubenswrapper[4795]: I0320 17:40:06.508651 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.269618 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1462264f-6c8a-4024-9465-3e7d2908ba24" path="/var/lib/kubelet/pods/1462264f-6c8a-4024-9465-3e7d2908ba24/volumes" Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.779811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74d437e5-b643-4a6f-a9d9-50cf8166d0af","Type":"ContainerStarted","Data":"65f55a27fa8e508e44d5a8d1bb44f105a57a43fd7b8f29f73e4c9d5944daa0d7"} Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.779931 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://65f55a27fa8e508e44d5a8d1bb44f105a57a43fd7b8f29f73e4c9d5944daa0d7" gracePeriod=30 Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.782642 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerStarted","Data":"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"} Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.782700 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerStarted","Data":"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"} Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.782821 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-log" containerID="cri-o://989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806" gracePeriod=30 Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.782874 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-metadata" containerID="cri-o://31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893" gracePeriod=30 Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.786022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2f35863-4f45-43d5-b600-9028b32195d7","Type":"ContainerStarted","Data":"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517"} Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.788117 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerStarted","Data":"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49"} Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.788142 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerStarted","Data":"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37"}
Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.805352 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.971128764 podStartE2EDuration="5.805315083s" podCreationTimestamp="2026-03-20 17:40:02 +0000 UTC" firstStartedPulling="2026-03-20 17:40:04.028315855 +0000 UTC m=+1347.486347396" lastFinishedPulling="2026-03-20 17:40:06.862502174 +0000 UTC m=+1350.320533715" observedRunningTime="2026-03-20 17:40:07.796014381 +0000 UTC m=+1351.254045942" watchObservedRunningTime="2026-03-20 17:40:07.805315083 +0000 UTC m=+1351.263346634"
Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.831958 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.691181171 podStartE2EDuration="5.831943451s" podCreationTimestamp="2026-03-20 17:40:02 +0000 UTC" firstStartedPulling="2026-03-20 17:40:03.721250289 +0000 UTC m=+1347.179281820" lastFinishedPulling="2026-03-20 17:40:06.862012569 +0000 UTC m=+1350.320044100" observedRunningTime="2026-03-20 17:40:07.819469148 +0000 UTC m=+1351.277500699" watchObservedRunningTime="2026-03-20 17:40:07.831943451 +0000 UTC m=+1351.289974992"
Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.852910 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.00154009 podStartE2EDuration="5.852844247s" podCreationTimestamp="2026-03-20 17:40:02 +0000 UTC" firstStartedPulling="2026-03-20 17:40:04.014824102 +0000 UTC m=+1347.472855643" lastFinishedPulling="2026-03-20 17:40:06.866128259 +0000 UTC m=+1350.324159800" observedRunningTime="2026-03-20 17:40:07.842602455 +0000 UTC m=+1351.300634016" watchObservedRunningTime="2026-03-20 17:40:07.852844247 +0000 UTC m=+1351.310875788"
Mar 20 17:40:07 crc kubenswrapper[4795]: I0320 17:40:07.874457 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8924514439999998 podStartE2EDuration="5.874436515s" podCreationTimestamp="2026-03-20 17:40:02 +0000 UTC" firstStartedPulling="2026-03-20 17:40:03.878796109 +0000 UTC m=+1347.336827650" lastFinishedPulling="2026-03-20 17:40:06.86078118 +0000 UTC m=+1350.318812721" observedRunningTime="2026-03-20 17:40:07.860750815 +0000 UTC m=+1351.318782366" watchObservedRunningTime="2026-03-20 17:40:07.874436515 +0000 UTC m=+1351.332468066"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.199031 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.357921 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.387524 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.530865 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle\") pod \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") "
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.531103 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs\") pod \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") "
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.531183 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data\") pod \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") "
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.531224 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjvw6\" (UniqueName: \"kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6\") pod \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\" (UID: \"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950\") "
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.531659 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs" (OuterVolumeSpecName: "logs") pod "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" (UID: "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.537097 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6" (OuterVolumeSpecName: "kube-api-access-cjvw6") pod "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" (UID: "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950"). InnerVolumeSpecName "kube-api-access-cjvw6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.567818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data" (OuterVolumeSpecName: "config-data") pod "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" (UID: "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.577769 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" (UID: "78fed1b3-613a-4eb8-bfee-cfc1eb4cb950"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.634555 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.634598 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.634614 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjvw6\" (UniqueName: \"kubernetes.io/projected/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-kube-api-access-cjvw6\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.634627 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.826935 4795 generic.go:334] "Generic (PLEG): container finished" podID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerID="31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893" exitCode=0
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.826975 4795 generic.go:334] "Generic (PLEG): container finished" podID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerID="989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806" exitCode=143
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.826988 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.826984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerDied","Data":"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"}
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.827488 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerDied","Data":"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"}
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.827613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"78fed1b3-613a-4eb8-bfee-cfc1eb4cb950","Type":"ContainerDied","Data":"ab84cd82bd3f42e54f99be97caf2de2ed929e55f48d616683e9c8f126716e9cd"}
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.827538 4795 scope.go:117] "RemoveContainer" containerID="31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.847563 4795 scope.go:117] "RemoveContainer" containerID="989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.869456 4795 scope.go:117] "RemoveContainer" containerID="31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.880992 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:40:08 crc kubenswrapper[4795]: E0320 17:40:08.881298 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893\": container with ID starting with 31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893 not found: ID does not exist" containerID="31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.881380 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"} err="failed to get container status \"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893\": rpc error: code = NotFound desc = could not find container \"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893\": container with ID starting with 31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893 not found: ID does not exist"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.881435 4795 scope.go:117] "RemoveContainer" containerID="989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"
Mar 20 17:40:08 crc kubenswrapper[4795]: E0320 17:40:08.882107 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806\": container with ID starting with 989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806 not found: ID does not exist" containerID="989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.882282 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"} err="failed to get container status \"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806\": rpc error: code = NotFound desc = could not find container \"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806\": container with ID starting with 989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806 not found: ID does not exist"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.882485 4795 scope.go:117] "RemoveContainer" containerID="31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.894759 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.895271 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893"} err="failed to get container status \"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893\": rpc error: code = NotFound desc = could not find container \"31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893\": container with ID starting with 31a4149f82824522eef3fd4f1bf50c100227be955d1264dfe4bb4151c11c7893 not found: ID does not exist"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.895484 4795 scope.go:117] "RemoveContainer" containerID="989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.896243 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806"} err="failed to get container status \"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806\": rpc error: code = NotFound desc = could not find container \"989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806\": container with ID starting with 989627adb99930cf11433d1892639a719e62ffcf4762b2794d850574ca29a806 not found: ID does not exist"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.907732 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:40:08 crc kubenswrapper[4795]: E0320 17:40:08.908285 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-log"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.908395 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-log"
Mar 20 17:40:08 crc kubenswrapper[4795]: E0320 17:40:08.908497 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e6fe9e-d22e-420c-b050-a00a53749f1f" containerName="oc"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.908578 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e6fe9e-d22e-420c-b050-a00a53749f1f" containerName="oc"
Mar 20 17:40:08 crc kubenswrapper[4795]: E0320 17:40:08.908662 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-metadata"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.908755 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-metadata"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.909044 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e6fe9e-d22e-420c-b050-a00a53749f1f" containerName="oc"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.909130 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-log"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.909246 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" containerName="nova-metadata-metadata"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.910542 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.915487 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.915734 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 17:40:08 crc kubenswrapper[4795]: I0320 17:40:08.917123 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.049002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.049122 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.049211 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.049242 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.049278 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2dkw\" (UniqueName: \"kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.151344 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.151391 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.151431 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2dkw\" (UniqueName: \"kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.151483 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.151541 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.151937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.155368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.161175 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.161249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.181048 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2dkw\" (UniqueName: \"kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw\") pod \"nova-metadata-0\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") " pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.264347 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.273289 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78fed1b3-613a-4eb8-bfee-cfc1eb4cb950" path="/var/lib/kubelet/pods/78fed1b3-613a-4eb8-bfee-cfc1eb4cb950/volumes"
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.719078 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:40:09 crc kubenswrapper[4795]: I0320 17:40:09.839031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerStarted","Data":"c172225cd88a41d11bff28076c479195964bb7b3c9cd3656f88a39cdd6d2adba"}
Mar 20 17:40:10 crc kubenswrapper[4795]: I0320 17:40:10.849395 4795 generic.go:334] "Generic (PLEG): container finished" podID="18794d5c-e43a-44dc-9510-763a31275104" containerID="2bb981684096c6d7989fb2cb73e5f71d3f241740ae3f49c070189810ce7e7bb1" exitCode=0
Mar 20 17:40:10 crc kubenswrapper[4795]: I0320 17:40:10.849476 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kmgk9" event={"ID":"18794d5c-e43a-44dc-9510-763a31275104","Type":"ContainerDied","Data":"2bb981684096c6d7989fb2cb73e5f71d3f241740ae3f49c070189810ce7e7bb1"}
Mar 20 17:40:10 crc kubenswrapper[4795]: I0320 17:40:10.852531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerStarted","Data":"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93"}
Mar 20 17:40:10 crc kubenswrapper[4795]: I0320 17:40:10.852574 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerStarted","Data":"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968"}
Mar 20 17:40:10 crc kubenswrapper[4795]: I0320 17:40:10.891126 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.891106126 podStartE2EDuration="2.891106126s" podCreationTimestamp="2026-03-20 17:40:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:10.889526796 +0000 UTC m=+1354.347558357" watchObservedRunningTime="2026-03-20 17:40:10.891106126 +0000 UTC m=+1354.349137667"
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.251271 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kmgk9"
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.420155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle\") pod \"18794d5c-e43a-44dc-9510-763a31275104\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") "
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.420526 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh9qq\" (UniqueName: \"kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq\") pod \"18794d5c-e43a-44dc-9510-763a31275104\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") "
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.420736 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts\") pod \"18794d5c-e43a-44dc-9510-763a31275104\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") "
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.420779 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data\") pod \"18794d5c-e43a-44dc-9510-763a31275104\" (UID: \"18794d5c-e43a-44dc-9510-763a31275104\") "
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.434535 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq" (OuterVolumeSpecName: "kube-api-access-fh9qq") pod "18794d5c-e43a-44dc-9510-763a31275104" (UID: "18794d5c-e43a-44dc-9510-763a31275104"). InnerVolumeSpecName "kube-api-access-fh9qq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.434535 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts" (OuterVolumeSpecName: "scripts") pod "18794d5c-e43a-44dc-9510-763a31275104" (UID: "18794d5c-e43a-44dc-9510-763a31275104"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.472105 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data" (OuterVolumeSpecName: "config-data") pod "18794d5c-e43a-44dc-9510-763a31275104" (UID: "18794d5c-e43a-44dc-9510-763a31275104"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.474591 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18794d5c-e43a-44dc-9510-763a31275104" (UID: "18794d5c-e43a-44dc-9510-763a31275104"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.524010 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh9qq\" (UniqueName: \"kubernetes.io/projected/18794d5c-e43a-44dc-9510-763a31275104-kube-api-access-fh9qq\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.524062 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.524082 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.524102 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18794d5c-e43a-44dc-9510-763a31275104-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.881567 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kmgk9" event={"ID":"18794d5c-e43a-44dc-9510-763a31275104","Type":"ContainerDied","Data":"221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf"}
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.881639 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="221a2a1b3ef429426dd11b646846e5dea7947dde1ab1f73c5366b50c2ad171bf"
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.881813 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kmgk9"
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.888750 4795 generic.go:334] "Generic (PLEG): container finished" podID="4f61db3a-a7de-495d-8305-b9e2910415e2" containerID="de6cb1e775438df313c59586663c35fb681b66b389159fc9df68bc69d850ac1c" exitCode=0
Mar 20 17:40:12 crc kubenswrapper[4795]: I0320 17:40:12.888756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" event={"ID":"4f61db3a-a7de-495d-8305-b9e2910415e2","Type":"ContainerDied","Data":"de6cb1e775438df313c59586663c35fb681b66b389159fc9df68bc69d850ac1c"}
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.124850 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.125361 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-log" containerID="cri-o://753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" gracePeriod=30
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.125434 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-api" containerID="cri-o://50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" gracePeriod=30
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.140442 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.140677 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b2f35863-4f45-43d5-b600-9028b32195d7" containerName="nova-scheduler-scheduler" containerID="cri-o://87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517" gracePeriod=30
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.164304 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.164558 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-log" containerID="cri-o://a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" gracePeriod=30
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.164657 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-metadata" containerID="cri-o://b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" gracePeriod=30
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.405090 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j"
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.489804 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"]
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.697838 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.703814 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856263 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data\") pod \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") "
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856335 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs\") pod \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") "
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs\") pod \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") "
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856403 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle\") pod \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") "
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9vf2\" (UniqueName: \"kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2\") pod \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") "
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs\") pod \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") "
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856583 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data\") pod \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\" (UID: \"f0082b8a-cf10-4449-a93f-b0c79e10e2d0\") "
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856642 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs" (OuterVolumeSpecName: "logs") pod "f0082b8a-cf10-4449-a93f-b0c79e10e2d0" (UID: "f0082b8a-cf10-4449-a93f-b0c79e10e2d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856666 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle\") pod \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") "
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2dkw\" (UniqueName: \"kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw\") pod \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\" (UID: \"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8\") "
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.856850 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs" (OuterVolumeSpecName: "logs") pod "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" (UID: "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.857151 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.857169 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.862078 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2" (OuterVolumeSpecName: "kube-api-access-j9vf2") pod "f0082b8a-cf10-4449-a93f-b0c79e10e2d0" (UID: "f0082b8a-cf10-4449-a93f-b0c79e10e2d0"). InnerVolumeSpecName "kube-api-access-j9vf2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.866955 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw" (OuterVolumeSpecName: "kube-api-access-d2dkw") pod "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" (UID: "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8"). InnerVolumeSpecName "kube-api-access-d2dkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.888182 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data" (OuterVolumeSpecName: "config-data") pod "f0082b8a-cf10-4449-a93f-b0c79e10e2d0" (UID: "f0082b8a-cf10-4449-a93f-b0c79e10e2d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.894874 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0082b8a-cf10-4449-a93f-b0c79e10e2d0" (UID: "f0082b8a-cf10-4449-a93f-b0c79e10e2d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.901189 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data" (OuterVolumeSpecName: "config-data") pod "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" (UID: "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.903273 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" (UID: "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.919311 4795 generic.go:334] "Generic (PLEG): container finished" podID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerID="50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" exitCode=0 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.919336 4795 generic.go:334] "Generic (PLEG): container finished" podID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerID="753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" exitCode=143 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.919425 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.919824 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerDied","Data":"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.920023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerDied","Data":"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.920048 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f0082b8a-cf10-4449-a93f-b0c79e10e2d0","Type":"ContainerDied","Data":"c016e3406d811f115716e682934de46c5a980e71042e6b56bc4a4a96322456f5"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.920065 4795 scope.go:117] "RemoveContainer" containerID="50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921183 4795 generic.go:334] "Generic (PLEG): container finished" podID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" 
containerID="b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" exitCode=0 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921212 4795 generic.go:334] "Generic (PLEG): container finished" podID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerID="a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" exitCode=143 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921227 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerDied","Data":"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerDied","Data":"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921611 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8","Type":"ContainerDied","Data":"c172225cd88a41d11bff28076c479195964bb7b3c9cd3656f88a39cdd6d2adba"} Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.921698 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="dnsmasq-dns" containerID="cri-o://154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93" gracePeriod=10 Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.926741 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" (UID: "5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.949889 4795 scope.go:117] "RemoveContainer" containerID="753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.950081 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: connect: connection refused" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967599 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967637 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967650 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9vf2\" (UniqueName: \"kubernetes.io/projected/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-kube-api-access-j9vf2\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967661 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0082b8a-cf10-4449-a93f-b0c79e10e2d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967671 4795 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967681 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2dkw\" (UniqueName: \"kubernetes.io/projected/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-kube-api-access-d2dkw\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:13 crc kubenswrapper[4795]: I0320 17:40:13.967715 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.011061 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.027162 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.028062 4795 scope.go:117] "RemoveContainer" containerID="50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.030240 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49\": container with ID starting with 50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49 not found: ID does not exist" containerID="50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.030313 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49"} err="failed to get container status 
\"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49\": rpc error: code = NotFound desc = could not find container \"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49\": container with ID starting with 50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.030365 4795 scope.go:117] "RemoveContainer" containerID="753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.034462 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37\": container with ID starting with 753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37 not found: ID does not exist" containerID="753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.034514 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37"} err="failed to get container status \"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37\": rpc error: code = NotFound desc = could not find container \"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37\": container with ID starting with 753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.034548 4795 scope.go:117] "RemoveContainer" containerID="50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.035039 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49"} err="failed to get 
container status \"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49\": rpc error: code = NotFound desc = could not find container \"50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49\": container with ID starting with 50db8832a793ace363d8d52828edb98caa3f7b22baf4dff0f68aaea76004fc49 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.035073 4795 scope.go:117] "RemoveContainer" containerID="753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.035480 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37"} err="failed to get container status \"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37\": rpc error: code = NotFound desc = could not find container \"753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37\": container with ID starting with 753f00da2ec62a41201e8ab0bf0f9d5a6014b2dacd2f17c9599b134f97aa9d37 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.035503 4795 scope.go:117] "RemoveContainer" containerID="b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038353 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.038727 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-log" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038742 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-log" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.038763 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-log" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038771 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-log" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.038800 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-metadata" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038806 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-metadata" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.038824 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18794d5c-e43a-44dc-9510-763a31275104" containerName="nova-manage" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038829 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="18794d5c-e43a-44dc-9510-763a31275104" containerName="nova-manage" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.038837 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-api" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.038842 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-api" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039012 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-metadata" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039026 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-log" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039034 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" containerName="nova-metadata-log" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039043 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="18794d5c-e43a-44dc-9510-763a31275104" containerName="nova-manage" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039051 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" containerName="nova-api-api" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.039922 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.045576 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.050480 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.092529 4795 scope.go:117] "RemoveContainer" containerID="a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.121259 4795 scope.go:117] "RemoveContainer" containerID="b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.123538 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93\": container with ID starting with b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93 not found: ID does not exist" containerID="b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.123593 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93"} err="failed to get container status \"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93\": rpc error: code = NotFound desc = could not find container \"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93\": container with ID starting with b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.123620 4795 scope.go:117] "RemoveContainer" containerID="a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.124218 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968\": container with ID starting with a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968 not found: ID does not exist" containerID="a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.124254 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968"} err="failed to get container status \"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968\": rpc error: code = NotFound desc = could not find container \"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968\": container with ID starting with a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.124277 4795 scope.go:117] "RemoveContainer" containerID="b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.124643 4795 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93"} err="failed to get container status \"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93\": rpc error: code = NotFound desc = could not find container \"b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93\": container with ID starting with b6592c8cdf3227dc6b81fcc81b94c82812f61932b5e5c745ee6b1222c6736b93 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.124679 4795 scope.go:117] "RemoveContainer" containerID="a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.124911 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968"} err="failed to get container status \"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968\": rpc error: code = NotFound desc = could not find container \"a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968\": container with ID starting with a5d7ad4138cf98646546a0027fc06d802c216be97f9accb54ef6cfe674f16968 not found: ID does not exist" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.171997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.172042 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 
17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.172073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86gkp\" (UniqueName: \"kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.172114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.275626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.275667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.275715 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86gkp\" (UniqueName: \"kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.275831 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.277759 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.282286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.283017 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.292134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86gkp\" (UniqueName: \"kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp\") pod \"nova-api-0\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.357560 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.357785 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.361735 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.378006 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.388952 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: E0320 17:40:14.389431 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f61db3a-a7de-495d-8305-b9e2910415e2" containerName="nova-cell1-conductor-db-sync" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.389444 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f61db3a-a7de-495d-8305-b9e2910415e2" containerName="nova-cell1-conductor-db-sync" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.389600 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f61db3a-a7de-495d-8305-b9e2910415e2" containerName="nova-cell1-conductor-db-sync" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395123 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts\") pod \"4f61db3a-a7de-495d-8305-b9e2910415e2\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395278 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395372 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395451 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.395562 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.406564 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.409178 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.410815 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts" (OuterVolumeSpecName: "scripts") pod "4f61db3a-a7de-495d-8305-b9e2910415e2" (UID: "4f61db3a-a7de-495d-8305-b9e2910415e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.411033 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.411456 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496563 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle\") pod \"4f61db3a-a7de-495d-8305-b9e2910415e2\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data\") pod \"4f61db3a-a7de-495d-8305-b9e2910415e2\" (UID: \"4f61db3a-a7de-495d-8305-b9e2910415e2\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496752 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvf65\" (UniqueName: \"kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65\") pod \"4f61db3a-a7de-495d-8305-b9e2910415e2\" (UID: 
\"4f61db3a-a7de-495d-8305-b9e2910415e2\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496913 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.496943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.497008 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.497047 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.497103 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.497974 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.500804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.512068 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.515960 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65" (OuterVolumeSpecName: "kube-api-access-cvf65") pod "4f61db3a-a7de-495d-8305-b9e2910415e2" (UID: "4f61db3a-a7de-495d-8305-b9e2910415e2"). InnerVolumeSpecName "kube-api-access-cvf65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.516463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.522324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf\") pod \"nova-metadata-0\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.543890 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f61db3a-a7de-495d-8305-b9e2910415e2" (UID: "4f61db3a-a7de-495d-8305-b9e2910415e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.546198 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data" (OuterVolumeSpecName: "config-data") pod "4f61db3a-a7de-495d-8305-b9e2910415e2" (UID: "4f61db3a-a7de-495d-8305-b9e2910415e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.576465 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.586223 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.605698 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvf65\" (UniqueName: \"kubernetes.io/projected/4f61db3a-a7de-495d-8305-b9e2910415e2-kube-api-access-cvf65\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.605731 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.605742 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f61db3a-a7de-495d-8305-b9e2910415e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710140 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710169 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-92c6w\" (UniqueName: \"kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710229 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.710251 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0\") pod \"da0f84b3-294d-455f-89e7-1c8f8439a837\" (UID: \"da0f84b3-294d-455f-89e7-1c8f8439a837\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.720872 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w" (OuterVolumeSpecName: "kube-api-access-92c6w") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "kube-api-access-92c6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.756889 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.782215 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config" (OuterVolumeSpecName: "config") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.784615 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.796634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.801376 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da0f84b3-294d-455f-89e7-1c8f8439a837" (UID: "da0f84b3-294d-455f-89e7-1c8f8439a837"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812588 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812634 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812645 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812655 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92c6w\" (UniqueName: \"kubernetes.io/projected/da0f84b3-294d-455f-89e7-1c8f8439a837-kube-api-access-92c6w\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812665 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.812675 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da0f84b3-294d-455f-89e7-1c8f8439a837-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.818808 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.873893 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.913543 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle\") pod \"b2f35863-4f45-43d5-b600-9028b32195d7\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.913715 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbkkt\" (UniqueName: \"kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt\") pod \"b2f35863-4f45-43d5-b600-9028b32195d7\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.913845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data\") pod \"b2f35863-4f45-43d5-b600-9028b32195d7\" (UID: \"b2f35863-4f45-43d5-b600-9028b32195d7\") " Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.917878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt" (OuterVolumeSpecName: "kube-api-access-sbkkt") pod "b2f35863-4f45-43d5-b600-9028b32195d7" (UID: "b2f35863-4f45-43d5-b600-9028b32195d7"). InnerVolumeSpecName "kube-api-access-sbkkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.940455 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerStarted","Data":"40ebeaf246d0901fcc00ff42264c8849595abc6aa664bf61e7a95863c72633fd"} Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.941203 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2f35863-4f45-43d5-b600-9028b32195d7" (UID: "b2f35863-4f45-43d5-b600-9028b32195d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.941959 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" event={"ID":"4f61db3a-a7de-495d-8305-b9e2910415e2","Type":"ContainerDied","Data":"be403fd93662530286fc5651363ea195fe87d753696088488bd47663f1933769"} Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.942001 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be403fd93662530286fc5651363ea195fe87d753696088488bd47663f1933769" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.942074 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcxtg" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.949076 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data" (OuterVolumeSpecName: "config-data") pod "b2f35863-4f45-43d5-b600-9028b32195d7" (UID: "b2f35863-4f45-43d5-b600-9028b32195d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.952550 4795 generic.go:334] "Generic (PLEG): container finished" podID="b2f35863-4f45-43d5-b600-9028b32195d7" containerID="87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517" exitCode=0 Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.952644 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.952613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2f35863-4f45-43d5-b600-9028b32195d7","Type":"ContainerDied","Data":"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517"} Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.952753 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2f35863-4f45-43d5-b600-9028b32195d7","Type":"ContainerDied","Data":"a81953c7eff097e8b8de2cddd252282ad6966a7afa286fafb4eb334123de90a3"} Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.952785 4795 scope.go:117] "RemoveContainer" containerID="87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.958210 4795 generic.go:334] "Generic (PLEG): container finished" podID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerID="154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93" exitCode=0 Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.958293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" event={"ID":"da0f84b3-294d-455f-89e7-1c8f8439a837","Type":"ContainerDied","Data":"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93"} Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.958318 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" Mar 20 17:40:14 crc kubenswrapper[4795]: I0320 17:40:14.958324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-s4k94" event={"ID":"da0f84b3-294d-455f-89e7-1c8f8439a837","Type":"ContainerDied","Data":"d7ccdefaaa93b0e48b444bfb331ca4591ab4806568e7a9f1ee5df6eaa4ff29c6"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.000128 4795 scope.go:117] "RemoveContainer" containerID="87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517" Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.000575 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517\": container with ID starting with 87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517 not found: ID does not exist" containerID="87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.000609 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517"} err="failed to get container status \"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517\": rpc error: code = NotFound desc = could not find container \"87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517\": container with ID starting with 87c12d527ed81da74967e3adc3fe1c80dd57ef1434ce129713566f58e0a54517 not found: ID does not exist" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.000635 4795 scope.go:117] "RemoveContainer" containerID="154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.016199 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.016224 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbkkt\" (UniqueName: \"kubernetes.io/projected/b2f35863-4f45-43d5-b600-9028b32195d7-kube-api-access-sbkkt\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.016233 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2f35863-4f45-43d5-b600-9028b32195d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.016779 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.017269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="dnsmasq-dns" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.017290 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="dnsmasq-dns" Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.017317 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="init" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.017327 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="init" Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.017353 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f35863-4f45-43d5-b600-9028b32195d7" containerName="nova-scheduler-scheduler" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.017363 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f35863-4f45-43d5-b600-9028b32195d7" containerName="nova-scheduler-scheduler" Mar 20 
17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.017592 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" containerName="dnsmasq-dns" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.017618 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2f35863-4f45-43d5-b600-9028b32195d7" containerName="nova-scheduler-scheduler" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.018375 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.037754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.039017 4795 scope.go:117] "RemoveContainer" containerID="7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.039694 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.050400 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.068637 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.101431 4795 scope.go:117] "RemoveContainer" containerID="154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93" Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.102343 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93\": container with ID starting with 154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93 not found: ID does not exist" 
containerID="154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.102373 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93"} err="failed to get container status \"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93\": rpc error: code = NotFound desc = could not find container \"154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93\": container with ID starting with 154eb2953d807e2f82f20404afae0eae5594b39c5f209ae3d9a78d62478c9e93 not found: ID does not exist" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.102398 4795 scope.go:117] "RemoveContainer" containerID="7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a" Mar 20 17:40:15 crc kubenswrapper[4795]: E0320 17:40:15.103177 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a\": container with ID starting with 7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a not found: ID does not exist" containerID="7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.103202 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a"} err="failed to get container status \"7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a\": rpc error: code = NotFound desc = could not find container \"7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a\": container with ID starting with 7bdce7e1708091c7cf30a8cd9ec7836d4e5cd71a97abc2f133dae0543b899d4a not found: ID does not exist" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.106751 4795 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.119793 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.119846 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28wdn\" (UniqueName: \"kubernetes.io/projected/19c15c93-572c-4d53-b924-172f3ad29c8a-kube-api-access-28wdn\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.119982 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.130762 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.131947 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.135085 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.142308 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-s4k94"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.163747 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.177934 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28wdn\" (UniqueName: \"kubernetes.io/projected/19c15c93-572c-4d53-b924-172f3ad29c8a-kube-api-access-28wdn\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgnnx\" (UniqueName: \"kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221663 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221696 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.221806 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.225332 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.228325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c15c93-572c-4d53-b924-172f3ad29c8a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.241257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28wdn\" (UniqueName: 
\"kubernetes.io/projected/19c15c93-572c-4d53-b924-172f3ad29c8a-kube-api-access-28wdn\") pod \"nova-cell1-conductor-0\" (UID: \"19c15c93-572c-4d53-b924-172f3ad29c8a\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.264677 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8" path="/var/lib/kubelet/pods/5dd3cfd4-62eb-4f9b-bb61-e981e20d3bb8/volumes" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.266202 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2f35863-4f45-43d5-b600-9028b32195d7" path="/var/lib/kubelet/pods/b2f35863-4f45-43d5-b600-9028b32195d7/volumes" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.266947 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da0f84b3-294d-455f-89e7-1c8f8439a837" path="/var/lib/kubelet/pods/da0f84b3-294d-455f-89e7-1c8f8439a837/volumes" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.267663 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0082b8a-cf10-4449-a93f-b0c79e10e2d0" path="/var/lib/kubelet/pods/f0082b8a-cf10-4449-a93f-b0c79e10e2d0/volumes" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.324096 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.324276 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.324364 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgnnx\" (UniqueName: \"kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.330130 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.330518 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.341232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgnnx\" (UniqueName: \"kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx\") pod \"nova-scheduler-0\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.367856 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.468659 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.793281 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.931186 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:15 crc kubenswrapper[4795]: W0320 17:40:15.933050 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef24c878_faa8_4a0b_a303_951d0a457eef.slice/crio-84bfddf694ecbc3a7df8e31cc881d1692c1599f3d1dc0bd61b96f49386da6d0c WatchSource:0}: Error finding container 84bfddf694ecbc3a7df8e31cc881d1692c1599f3d1dc0bd61b96f49386da6d0c: Status 404 returned error can't find the container with id 84bfddf694ecbc3a7df8e31cc881d1692c1599f3d1dc0bd61b96f49386da6d0c Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.979469 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerStarted","Data":"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.979508 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerStarted","Data":"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.979518 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerStarted","Data":"468acb8849abdd77a144584691b96b8cbadeae923d66dd538f230c0aee8d52cb"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.982257 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"19c15c93-572c-4d53-b924-172f3ad29c8a","Type":"ContainerStarted","Data":"2d7a81c2145eeccbda9da96a1c6112a931d61cce32b35c2cb50ce493055e207b"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.984699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef24c878-faa8-4a0b-a303-951d0a457eef","Type":"ContainerStarted","Data":"84bfddf694ecbc3a7df8e31cc881d1692c1599f3d1dc0bd61b96f49386da6d0c"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.987113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerStarted","Data":"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.987138 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerStarted","Data":"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843"} Mar 20 17:40:15 crc kubenswrapper[4795]: I0320 17:40:15.994907 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.994889277 podStartE2EDuration="1.994889277s" podCreationTimestamp="2026-03-20 17:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:15.992964725 +0000 UTC m=+1359.450996286" watchObservedRunningTime="2026-03-20 17:40:15.994889277 +0000 UTC m=+1359.452920828" Mar 20 17:40:16 crc kubenswrapper[4795]: I0320 17:40:16.022510 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.022491244 podStartE2EDuration="2.022491244s" podCreationTimestamp="2026-03-20 17:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 17:40:16.010721094 +0000 UTC m=+1359.468752655" watchObservedRunningTime="2026-03-20 17:40:16.022491244 +0000 UTC m=+1359.480522805" Mar 20 17:40:17 crc kubenswrapper[4795]: I0320 17:40:17.000966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"19c15c93-572c-4d53-b924-172f3ad29c8a","Type":"ContainerStarted","Data":"50aadf2db6761032903224e02e2fdd62cf9b757415bac79531147b03ed93db54"} Mar 20 17:40:17 crc kubenswrapper[4795]: I0320 17:40:17.001189 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:17 crc kubenswrapper[4795]: I0320 17:40:17.003217 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef24c878-faa8-4a0b-a303-951d0a457eef","Type":"ContainerStarted","Data":"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463"} Mar 20 17:40:17 crc kubenswrapper[4795]: I0320 17:40:17.028170 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.02815522 podStartE2EDuration="3.02815522s" podCreationTimestamp="2026-03-20 17:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:17.026121937 +0000 UTC m=+1360.484153508" watchObservedRunningTime="2026-03-20 17:40:17.02815522 +0000 UTC m=+1360.486186761" Mar 20 17:40:17 crc kubenswrapper[4795]: I0320 17:40:17.049410 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.049394538 podStartE2EDuration="2.049394538s" podCreationTimestamp="2026-03-20 17:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:17.046894722 +0000 UTC m=+1360.504926263" 
watchObservedRunningTime="2026-03-20 17:40:17.049394538 +0000 UTC m=+1360.507426079" Mar 20 17:40:20 crc kubenswrapper[4795]: I0320 17:40:20.470133 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 17:40:20 crc kubenswrapper[4795]: I0320 17:40:20.963161 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.359223 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.359554 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.496845 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.497127 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" containerName="kube-state-metrics" containerID="cri-o://19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802" gracePeriod=30 Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.586560 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:40:24 crc kubenswrapper[4795]: I0320 17:40:24.586871 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.035823 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.105779 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" containerID="19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802" exitCode=2 Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.105818 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8","Type":"ContainerDied","Data":"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802"} Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.105845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8","Type":"ContainerDied","Data":"715534f72ece852c083764840657cce952ec7708ddcedcd00af2caddc251418f"} Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.105861 4795 scope.go:117] "RemoveContainer" containerID="19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.106013 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.137371 4795 scope.go:117] "RemoveContainer" containerID="19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802" Mar 20 17:40:25 crc kubenswrapper[4795]: E0320 17:40:25.137852 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802\": container with ID starting with 19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802 not found: ID does not exist" containerID="19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.137883 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802"} err="failed to get container status \"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802\": rpc error: code = NotFound desc = could not find container \"19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802\": container with ID starting with 19da13a276e823cac4e5c0a548e41f17be0aa4ce823d7cf2de0a4d76f5afb802 not found: ID does not exist" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.224442 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k6xp\" (UniqueName: \"kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp\") pod \"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8\" (UID: \"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8\") " Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.252969 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp" (OuterVolumeSpecName: "kube-api-access-5k6xp") pod "5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" (UID: 
"5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8"). InnerVolumeSpecName "kube-api-access-5k6xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.327436 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k6xp\" (UniqueName: \"kubernetes.io/projected/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8-kube-api-access-5k6xp\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.396242 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.400283 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.441664 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.469868 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.470492 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.479458 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.506239 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:25 crc kubenswrapper[4795]: 
E0320 17:40:25.506613 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" containerName="kube-state-metrics" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.506624 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" containerName="kube-state-metrics" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.506787 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" containerName="kube-state-metrics" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.507324 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.511824 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.511997 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.513482 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.529729 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.596864 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.596899 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.632000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlwkj\" (UniqueName: \"kubernetes.io/projected/72605c7d-99df-450f-900b-3022b0520149-kube-api-access-jlwkj\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.632044 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.632117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.632154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.733503 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlwkj\" 
(UniqueName: \"kubernetes.io/projected/72605c7d-99df-450f-900b-3022b0520149-kube-api-access-jlwkj\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.733558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.733626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.734343 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.747856 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.747920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.748129 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/72605c7d-99df-450f-900b-3022b0520149-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.753548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlwkj\" (UniqueName: \"kubernetes.io/projected/72605c7d-99df-450f-900b-3022b0520149-kube-api-access-jlwkj\") pod \"kube-state-metrics-0\" (UID: \"72605c7d-99df-450f-900b-3022b0520149\") " pod="openstack/kube-state-metrics-0" Mar 20 17:40:25 crc kubenswrapper[4795]: I0320 17:40:25.824607 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.169302 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.294183 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:40:26 crc kubenswrapper[4795]: W0320 17:40:26.294911 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72605c7d_99df_450f_900b_3022b0520149.slice/crio-6533b348b3ad7d46fa777cba6c7d28ba981a053a4077c8c602919ce79344e2d8 WatchSource:0}: Error finding container 6533b348b3ad7d46fa777cba6c7d28ba981a053a4077c8c602919ce79344e2d8: Status 404 returned error can't find the container with id 6533b348b3ad7d46fa777cba6c7d28ba981a053a4077c8c602919ce79344e2d8 Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.478038 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.478306 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-central-agent" containerID="cri-o://ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1" gracePeriod=30 Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.478365 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="proxy-httpd" containerID="cri-o://1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f" gracePeriod=30 Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.478411 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb437f62-80bf-465d-85cf-12348aba1514" 
containerName="sg-core" containerID="cri-o://a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc" gracePeriod=30 Mar 20 17:40:26 crc kubenswrapper[4795]: I0320 17:40:26.478411 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-notification-agent" containerID="cri-o://f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c" gracePeriod=30 Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149459 4795 generic.go:334] "Generic (PLEG): container finished" podID="eb437f62-80bf-465d-85cf-12348aba1514" containerID="1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f" exitCode=0 Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149769 4795 generic.go:334] "Generic (PLEG): container finished" podID="eb437f62-80bf-465d-85cf-12348aba1514" containerID="a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc" exitCode=2 Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149784 4795 generic.go:334] "Generic (PLEG): container finished" podID="eb437f62-80bf-465d-85cf-12348aba1514" containerID="ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1" exitCode=0 Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149524 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerDied","Data":"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f"} Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerDied","Data":"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc"} Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.149875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerDied","Data":"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1"} Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.151628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"72605c7d-99df-450f-900b-3022b0520149","Type":"ContainerStarted","Data":"ded6295ee3fa941441f90f6f2bd86fc156736a39b08b900c51df152156208d21"} Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.151675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"72605c7d-99df-450f-900b-3022b0520149","Type":"ContainerStarted","Data":"6533b348b3ad7d46fa777cba6c7d28ba981a053a4077c8c602919ce79344e2d8"} Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.171039 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.808220489 podStartE2EDuration="2.171018731s" podCreationTimestamp="2026-03-20 17:40:25 +0000 UTC" firstStartedPulling="2026-03-20 17:40:26.297084279 +0000 UTC m=+1369.755115820" lastFinishedPulling="2026-03-20 17:40:26.659882521 +0000 UTC m=+1370.117914062" observedRunningTime="2026-03-20 17:40:27.166022449 +0000 UTC m=+1370.624053990" watchObservedRunningTime="2026-03-20 17:40:27.171018731 +0000 UTC m=+1370.629050282" Mar 20 17:40:27 crc kubenswrapper[4795]: I0320 17:40:27.269342 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8" path="/var/lib/kubelet/pods/5e0ef4d8-84f6-44a1-bb00-a66f3c7fbdf8/volumes" Mar 20 17:40:28 crc kubenswrapper[4795]: I0320 17:40:28.160741 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.090111 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161411 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161469 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161552 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bll6z\" (UniqueName: \"kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161645 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161738 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161775 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.161807 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data\") pod \"eb437f62-80bf-465d-85cf-12348aba1514\" (UID: \"eb437f62-80bf-465d-85cf-12348aba1514\") " Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.162885 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.162419 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.170953 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts" (OuterVolumeSpecName: "scripts") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.170989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z" (OuterVolumeSpecName: "kube-api-access-bll6z") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "kube-api-access-bll6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.201960 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.208668 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.208715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerDied","Data":"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c"} Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.208681 4795 generic.go:334] "Generic (PLEG): container finished" podID="eb437f62-80bf-465d-85cf-12348aba1514" containerID="f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c" exitCode=0 Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.208833 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb437f62-80bf-465d-85cf-12348aba1514","Type":"ContainerDied","Data":"39c11fccd9e673059022bf047af401ca209830155fcf251e8d72aeeb8fa6e0d2"} Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.208876 4795 scope.go:117] "RemoveContainer" containerID="1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.242094 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.263556 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.264447 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.265308 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.265386 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb437f62-80bf-465d-85cf-12348aba1514-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.265455 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.265510 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bll6z\" (UniqueName: \"kubernetes.io/projected/eb437f62-80bf-465d-85cf-12348aba1514-kube-api-access-bll6z\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.271405 4795 scope.go:117] "RemoveContainer" containerID="a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.280235 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data" (OuterVolumeSpecName: "config-data") pod "eb437f62-80bf-465d-85cf-12348aba1514" (UID: "eb437f62-80bf-465d-85cf-12348aba1514"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.291259 4795 scope.go:117] "RemoveContainer" containerID="f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.311065 4795 scope.go:117] "RemoveContainer" containerID="ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.328536 4795 scope.go:117] "RemoveContainer" containerID="1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.328936 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f\": container with ID starting with 1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f not found: ID does not exist" containerID="1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.328980 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f"} err="failed to get container status \"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f\": rpc error: code = NotFound desc = could not find container \"1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f\": container with ID starting with 1e21f702a56a5dfc52020ac49959591790871c116d0c9e4602aa40cd4c90734f not found: ID does not exist" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329005 4795 scope.go:117] "RemoveContainer" 
containerID="a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.329337 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc\": container with ID starting with a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc not found: ID does not exist" containerID="a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329367 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc"} err="failed to get container status \"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc\": rpc error: code = NotFound desc = could not find container \"a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc\": container with ID starting with a549db2fef32d99028f51d245c4786ed957b75c50df1dce4ad289bc5613db3dc not found: ID does not exist" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329388 4795 scope.go:117] "RemoveContainer" containerID="f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.329640 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c\": container with ID starting with f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c not found: ID does not exist" containerID="f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329664 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c"} err="failed to get container status \"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c\": rpc error: code = NotFound desc = could not find container \"f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c\": container with ID starting with f78705e1cf3362143ad1d94a36908df425b3541d33b0f5a3c2b0a14a46adc16c not found: ID does not exist" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329680 4795 scope.go:117] "RemoveContainer" containerID="ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.329901 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1\": container with ID starting with ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1 not found: ID does not exist" containerID="ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.329924 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1"} err="failed to get container status \"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1\": rpc error: code = NotFound desc = could not find container \"ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1\": container with ID starting with ee22749f69e48c7b1c6c3936b2ecd86dbb2f288452d9ff439fbe6d001aebffd1 not found: ID does not exist" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.358639 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.358704 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.367321 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb437f62-80bf-465d-85cf-12348aba1514-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.547945 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.557580 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.585723 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.586227 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-notification-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586243 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-notification-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.586275 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="sg-core" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586286 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="sg-core" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 17:40:32.586307 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-central-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586316 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-central-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: E0320 
17:40:32.586329 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="proxy-httpd" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586337 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="proxy-httpd" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586548 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-notification-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586568 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="sg-core" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586579 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="proxy-httpd" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.586600 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb437f62-80bf-465d-85cf-12348aba1514" containerName="ceilometer-central-agent" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.588970 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.590292 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.591365 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.593101 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.640826 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.641031 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.641029 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.672815 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.672865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.672951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.673065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.673413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.673484 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.673521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvv62\" (UniqueName: \"kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.673902 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.775878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvv62\" (UniqueName: \"kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.775946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776139 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776197 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.776792 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.781392 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 
17:40:32.781408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.781768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.782148 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.783310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.799917 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvv62\" (UniqueName: \"kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62\") pod \"ceilometer-0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " pod="openstack/ceilometer-0" Mar 20 17:40:32 crc kubenswrapper[4795]: I0320 17:40:32.944513 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:33 crc kubenswrapper[4795]: I0320 17:40:33.262335 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb437f62-80bf-465d-85cf-12348aba1514" path="/var/lib/kubelet/pods/eb437f62-80bf-465d-85cf-12348aba1514/volumes" Mar 20 17:40:33 crc kubenswrapper[4795]: I0320 17:40:33.402454 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:33 crc kubenswrapper[4795]: W0320 17:40:33.409665 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode15f36e1_3fd7_43bc_9aaa_d793c6a43fd0.slice/crio-a5462b7d9ee6668ab2f74800a7d8242e55e688c0522640f520f27564c8696ebe WatchSource:0}: Error finding container a5462b7d9ee6668ab2f74800a7d8242e55e688c0522640f520f27564c8696ebe: Status 404 returned error can't find the container with id a5462b7d9ee6668ab2f74800a7d8242e55e688c0522640f520f27564c8696ebe Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.232335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerStarted","Data":"a5462b7d9ee6668ab2f74800a7d8242e55e688c0522640f520f27564c8696ebe"} Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.365012 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.365826 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.375779 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.591491 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 
17:40:34.600469 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:40:34 crc kubenswrapper[4795]: I0320 17:40:34.606378 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.263970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.264243 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerStarted","Data":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.264264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerStarted","Data":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.273301 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.478964 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.488493 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.499495 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528559 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528709 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528748 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528775 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dmn5j\" (UniqueName: \"kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.528827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632338 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmn5j\" (UniqueName: 
\"kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.632490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.633584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.634339 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.635203 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.636019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.637346 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.660039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmn5j\" (UniqueName: \"kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j\") pod \"dnsmasq-dns-89c5cd4d5-g7pv7\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.820155 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:35 crc kubenswrapper[4795]: I0320 17:40:35.843964 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 17:40:36 crc kubenswrapper[4795]: I0320 17:40:36.267899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerStarted","Data":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} Mar 20 17:40:36 crc kubenswrapper[4795]: I0320 17:40:36.380278 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:40:37 crc kubenswrapper[4795]: I0320 17:40:37.308702 4795 generic.go:334] "Generic (PLEG): container finished" podID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerID="b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087" exitCode=0 Mar 20 17:40:37 crc kubenswrapper[4795]: I0320 17:40:37.308906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" event={"ID":"35b8efb0-212f-4ee0-bb05-4655aff260b5","Type":"ContainerDied","Data":"b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087"} Mar 20 17:40:37 crc kubenswrapper[4795]: I0320 17:40:37.309847 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" event={"ID":"35b8efb0-212f-4ee0-bb05-4655aff260b5","Type":"ContainerStarted","Data":"54687987edb9f2765e8d4f7b8bfef3664f1024d2c67848e40765c69ff1c22cea"} Mar 20 17:40:37 crc kubenswrapper[4795]: I0320 17:40:37.608835 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:37 crc kubenswrapper[4795]: I0320 17:40:37.901013 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.328455 4795 generic.go:334] "Generic (PLEG): container 
finished" podID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" containerID="65f55a27fa8e508e44d5a8d1bb44f105a57a43fd7b8f29f73e4c9d5944daa0d7" exitCode=137 Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.328494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74d437e5-b643-4a6f-a9d9-50cf8166d0af","Type":"ContainerDied","Data":"65f55a27fa8e508e44d5a8d1bb44f105a57a43fd7b8f29f73e4c9d5944daa0d7"} Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.330332 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" event={"ID":"35b8efb0-212f-4ee0-bb05-4655aff260b5","Type":"ContainerStarted","Data":"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8"} Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.330507 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-log" containerID="cri-o://9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843" gracePeriod=30 Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.330579 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-api" containerID="cri-o://0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5" gracePeriod=30 Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.364621 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" podStartSLOduration=3.36459957 podStartE2EDuration="3.36459957s" podCreationTimestamp="2026-03-20 17:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:38.360737023 +0000 UTC m=+1381.818768584" watchObservedRunningTime="2026-03-20 17:40:38.36459957 +0000 UTC 
m=+1381.822631111" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.386114 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.586173 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data\") pod \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.586480 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbfmr\" (UniqueName: \"kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr\") pod \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.586683 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle\") pod \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\" (UID: \"74d437e5-b643-4a6f-a9d9-50cf8166d0af\") " Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.591427 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr" (OuterVolumeSpecName: "kube-api-access-xbfmr") pod "74d437e5-b643-4a6f-a9d9-50cf8166d0af" (UID: "74d437e5-b643-4a6f-a9d9-50cf8166d0af"). InnerVolumeSpecName "kube-api-access-xbfmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.629616 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data" (OuterVolumeSpecName: "config-data") pod "74d437e5-b643-4a6f-a9d9-50cf8166d0af" (UID: "74d437e5-b643-4a6f-a9d9-50cf8166d0af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.648748 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74d437e5-b643-4a6f-a9d9-50cf8166d0af" (UID: "74d437e5-b643-4a6f-a9d9-50cf8166d0af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.688910 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.688945 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbfmr\" (UniqueName: \"kubernetes.io/projected/74d437e5-b643-4a6f-a9d9-50cf8166d0af-kube-api-access-xbfmr\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:38 crc kubenswrapper[4795]: I0320 17:40:38.688955 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d437e5-b643-4a6f-a9d9-50cf8166d0af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.339460 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff123956-68b6-4a60-ac22-1972b9554205" containerID="9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843" 
exitCode=143 Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.339525 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerDied","Data":"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843"} Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.341101 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.341145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74d437e5-b643-4a6f-a9d9-50cf8166d0af","Type":"ContainerDied","Data":"a6f7fa5a8cd4bb19f7f275862231fd5657f29dc6b3da296fa035e77f5df8d7d5"} Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.341187 4795 scope.go:117] "RemoveContainer" containerID="65f55a27fa8e508e44d5a8d1bb44f105a57a43fd7b8f29f73e4c9d5944daa0d7" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.341417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.367479 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.390304 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.406569 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:39 crc kubenswrapper[4795]: E0320 17:40:39.407119 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.407144 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.407498 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.408523 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.414334 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.414428 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.414689 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.417095 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.606955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.607281 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" 
Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.607406 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.607584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.607683 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5qhs\" (UniqueName: \"kubernetes.io/projected/d2a5e398-6d25-43b1-8c29-407af2d9348b-kube-api-access-g5qhs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.708470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.708513 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5qhs\" (UniqueName: \"kubernetes.io/projected/d2a5e398-6d25-43b1-8c29-407af2d9348b-kube-api-access-g5qhs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.708550 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.708589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.708639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.713579 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.713589 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.713742 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.718147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a5e398-6d25-43b1-8c29-407af2d9348b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.729985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5qhs\" (UniqueName: \"kubernetes.io/projected/d2a5e398-6d25-43b1-8c29-407af2d9348b-kube-api-access-g5qhs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d2a5e398-6d25-43b1-8c29-407af2d9348b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:39 crc kubenswrapper[4795]: I0320 17:40:39.731670 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:40:40 crc kubenswrapper[4795]: I0320 17:40:40.211515 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:40:40 crc kubenswrapper[4795]: I0320 17:40:40.357601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d2a5e398-6d25-43b1-8c29-407af2d9348b","Type":"ContainerStarted","Data":"95807406a254da5fe2ae43dc10f1b3b1a4186d1ac881af24446bccfb5750f60d"} Mar 20 17:40:41 crc kubenswrapper[4795]: I0320 17:40:41.265184 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d437e5-b643-4a6f-a9d9-50cf8166d0af" path="/var/lib/kubelet/pods/74d437e5-b643-4a6f-a9d9-50cf8166d0af/volumes" Mar 20 17:40:41 crc kubenswrapper[4795]: I0320 17:40:41.427105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d2a5e398-6d25-43b1-8c29-407af2d9348b","Type":"ContainerStarted","Data":"1360ebb15bfb90c2fbb3a893d08d00218917f150a6ae9f359d41fcd3ab50f8b6"} Mar 20 17:40:41 crc kubenswrapper[4795]: I0320 17:40:41.473100 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.473087252 podStartE2EDuration="2.473087252s" podCreationTimestamp="2026-03-20 17:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:41.468083699 +0000 UTC m=+1384.926115230" watchObservedRunningTime="2026-03-20 17:40:41.473087252 +0000 UTC m=+1384.931118793" Mar 20 17:40:41 crc kubenswrapper[4795]: E0320 17:40:41.745995 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff123956_68b6_4a60_ac22_1972b9554205.slice/crio-0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff123956_68b6_4a60_ac22_1972b9554205.slice/crio-conmon-0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.014553 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.160645 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data\") pod \"ff123956-68b6-4a60-ac22-1972b9554205\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.160734 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle\") pod \"ff123956-68b6-4a60-ac22-1972b9554205\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.160804 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86gkp\" (UniqueName: \"kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp\") pod \"ff123956-68b6-4a60-ac22-1972b9554205\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.161034 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs\") pod 
\"ff123956-68b6-4a60-ac22-1972b9554205\" (UID: \"ff123956-68b6-4a60-ac22-1972b9554205\") " Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.161878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs" (OuterVolumeSpecName: "logs") pod "ff123956-68b6-4a60-ac22-1972b9554205" (UID: "ff123956-68b6-4a60-ac22-1972b9554205"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.167235 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp" (OuterVolumeSpecName: "kube-api-access-86gkp") pod "ff123956-68b6-4a60-ac22-1972b9554205" (UID: "ff123956-68b6-4a60-ac22-1972b9554205"). InnerVolumeSpecName "kube-api-access-86gkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.195792 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data" (OuterVolumeSpecName: "config-data") pod "ff123956-68b6-4a60-ac22-1972b9554205" (UID: "ff123956-68b6-4a60-ac22-1972b9554205"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.210528 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff123956-68b6-4a60-ac22-1972b9554205" (UID: "ff123956-68b6-4a60-ac22-1972b9554205"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.263121 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.263144 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff123956-68b6-4a60-ac22-1972b9554205-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.263155 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86gkp\" (UniqueName: \"kubernetes.io/projected/ff123956-68b6-4a60-ac22-1972b9554205-kube-api-access-86gkp\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.263164 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff123956-68b6-4a60-ac22-1972b9554205-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.437773 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff123956-68b6-4a60-ac22-1972b9554205" containerID="0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5" exitCode=0 Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.437855 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.437867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerDied","Data":"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5"} Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.438224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff123956-68b6-4a60-ac22-1972b9554205","Type":"ContainerDied","Data":"40ebeaf246d0901fcc00ff42264c8849595abc6aa664bf61e7a95863c72633fd"} Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.438242 4795 scope.go:117] "RemoveContainer" containerID="0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.440977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerStarted","Data":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.441144 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-central-agent" containerID="cri-o://c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" gracePeriod=30 Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.441149 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="proxy-httpd" containerID="cri-o://02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" gracePeriod=30 Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.441210 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="sg-core" containerID="cri-o://fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" gracePeriod=30 Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.441244 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-notification-agent" containerID="cri-o://becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" gracePeriod=30 Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.467584 4795 scope.go:117] "RemoveContainer" containerID="9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.483999 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.262729271 podStartE2EDuration="10.483977345s" podCreationTimestamp="2026-03-20 17:40:32 +0000 UTC" firstStartedPulling="2026-03-20 17:40:33.412033367 +0000 UTC m=+1376.870064908" lastFinishedPulling="2026-03-20 17:40:41.633281441 +0000 UTC m=+1385.091312982" observedRunningTime="2026-03-20 17:40:42.464767168 +0000 UTC m=+1385.922798709" watchObservedRunningTime="2026-03-20 17:40:42.483977345 +0000 UTC m=+1385.942008886" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.487433 4795 scope.go:117] "RemoveContainer" containerID="0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5" Mar 20 17:40:42 crc kubenswrapper[4795]: E0320 17:40:42.487885 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5\": container with ID starting with 0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5 not found: ID does not exist" containerID="0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5" Mar 20 17:40:42 crc 
kubenswrapper[4795]: I0320 17:40:42.487924 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5"} err="failed to get container status \"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5\": rpc error: code = NotFound desc = could not find container \"0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5\": container with ID starting with 0dfefe624358bd9333c75e1a1527a62d27b13d80f89a6130296f2a04718908a5 not found: ID does not exist" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.487951 4795 scope.go:117] "RemoveContainer" containerID="9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843" Mar 20 17:40:42 crc kubenswrapper[4795]: E0320 17:40:42.488235 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843\": container with ID starting with 9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843 not found: ID does not exist" containerID="9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.488284 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843"} err="failed to get container status \"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843\": rpc error: code = NotFound desc = could not find container \"9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843\": container with ID starting with 9fed84bb6533339d62b8824a7c311c27d2193d9c0ea583c9a8525d2da89cd843 not found: ID does not exist" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.495491 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:42 crc kubenswrapper[4795]: 
I0320 17:40:42.503848 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.511008 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:42 crc kubenswrapper[4795]: E0320 17:40:42.511481 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-api" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.511504 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-api" Mar 20 17:40:42 crc kubenswrapper[4795]: E0320 17:40:42.511526 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-log" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.511533 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-log" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.511803 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-api" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.511825 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff123956-68b6-4a60-ac22-1972b9554205" containerName="nova-api-log" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.512878 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.516202 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.517199 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.517939 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.518297 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670170 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670250 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670294 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhkc8\" (UniqueName: \"kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.670404 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772538 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772579 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 
17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhkc8\" (UniqueName: \"kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.772705 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.773182 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.778057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.778585 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.779938 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.780112 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.794137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhkc8\" (UniqueName: \"kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8\") pod \"nova-api-0\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " pod="openstack/nova-api-0" Mar 20 17:40:42 crc kubenswrapper[4795]: I0320 17:40:42.827230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.108166 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.263677 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff123956-68b6-4a60-ac22-1972b9554205" path="/var/lib/kubelet/pods/ff123956-68b6-4a60-ac22-1972b9554205/volumes" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.281953 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282103 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvv62\" (UniqueName: \"kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs\") 
pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282244 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282263 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282312 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd\") pod \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\" (UID: \"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0\") " Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.282992 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.283068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.287446 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62" (OuterVolumeSpecName: "kube-api-access-pvv62") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "kube-api-access-pvv62". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.288819 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts" (OuterVolumeSpecName: "scripts") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.311364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.350916 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:43 crc kubenswrapper[4795]: W0320 17:40:43.357898 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc7f1fae_ee02_4e5c_a06c_9cfacbdc5207.slice/crio-587df48f97c3203a94e266d052630fadcdcbe04e7e5ddd68a72a13d942007e47 WatchSource:0}: Error finding container 587df48f97c3203a94e266d052630fadcdcbe04e7e5ddd68a72a13d942007e47: Status 404 returned error can't find the container with id 587df48f97c3203a94e266d052630fadcdcbe04e7e5ddd68a72a13d942007e47 Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.360474 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.381752 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385007 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvv62\" (UniqueName: \"kubernetes.io/projected/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-kube-api-access-pvv62\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385046 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385797 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385810 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385819 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385827 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.385844 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.398158 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data" (OuterVolumeSpecName: "config-data") pod "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" (UID: "e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461093 4795 generic.go:334] "Generic (PLEG): container finished" podID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" exitCode=0 Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461131 4795 generic.go:334] "Generic (PLEG): container finished" podID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" exitCode=2 Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461141 4795 generic.go:334] "Generic (PLEG): container finished" podID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" exitCode=0 Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461153 4795 generic.go:334] "Generic (PLEG): container finished" podID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" exitCode=0 Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461157 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461205 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerDied","Data":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerDied","Data":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461273 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerDied","Data":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerDied","Data":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461293 4795 scope.go:117] "RemoveContainer" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.461299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0","Type":"ContainerDied","Data":"a5462b7d9ee6668ab2f74800a7d8242e55e688c0522640f520f27564c8696ebe"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.462374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerStarted","Data":"587df48f97c3203a94e266d052630fadcdcbe04e7e5ddd68a72a13d942007e47"} Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.491503 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.504739 4795 scope.go:117] "RemoveContainer" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.521475 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.533387 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.542877 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.543255 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="sg-core" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543271 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="sg-core" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.543297 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-notification-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543304 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-notification-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.543319 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="proxy-httpd" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543325 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="proxy-httpd" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.543337 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-central-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543343 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-central-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543511 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-notification-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543528 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="sg-core" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543544 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="ceilometer-central-agent" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.543553 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" containerName="proxy-httpd" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.545355 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.546536 4795 scope.go:117] "RemoveContainer" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.548345 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.548516 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.548651 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.565058 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.596454 4795 scope.go:117] "RemoveContainer" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.621593 4795 scope.go:117] "RemoveContainer" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.622152 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": container with ID starting with 02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889 not found: ID does not exist" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.622184 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} err="failed to get container status 
\"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": rpc error: code = NotFound desc = could not find container \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": container with ID starting with 02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.622206 4795 scope.go:117] "RemoveContainer" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.622526 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": container with ID starting with fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93 not found: ID does not exist" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.622578 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} err="failed to get container status \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": rpc error: code = NotFound desc = could not find container \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": container with ID starting with fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.622613 4795 scope.go:117] "RemoveContainer" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.623057 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": container with ID starting with becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7 not found: ID does not exist" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623085 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} err="failed to get container status \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": rpc error: code = NotFound desc = could not find container \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": container with ID starting with becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623102 4795 scope.go:117] "RemoveContainer" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" Mar 20 17:40:43 crc kubenswrapper[4795]: E0320 17:40:43.623337 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": container with ID starting with c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8 not found: ID does not exist" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623358 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} err="failed to get container status \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": rpc error: code = NotFound desc = could not find container \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": container with ID 
starting with c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623370 4795 scope.go:117] "RemoveContainer" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623555 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} err="failed to get container status \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": rpc error: code = NotFound desc = could not find container \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": container with ID starting with 02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623576 4795 scope.go:117] "RemoveContainer" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623798 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} err="failed to get container status \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": rpc error: code = NotFound desc = could not find container \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": container with ID starting with fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93 not found: ID does not exist" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.623818 4795 scope.go:117] "RemoveContainer" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7" Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624018 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} err="failed to get container status \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": rpc error: code = NotFound desc = could not find container \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": container with ID starting with becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7 not found: ID does not exist"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624034 4795 scope.go:117] "RemoveContainer" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624199 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} err="failed to get container status \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": rpc error: code = NotFound desc = could not find container \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": container with ID starting with c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8 not found: ID does not exist"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624221 4795 scope.go:117] "RemoveContainer" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624431 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} err="failed to get container status \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": rpc error: code = NotFound desc = could not find container \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": container with ID starting with 02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889 not found: ID does not exist"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624477 4795 scope.go:117] "RemoveContainer" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624752 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} err="failed to get container status \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": rpc error: code = NotFound desc = could not find container \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": container with ID starting with fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93 not found: ID does not exist"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624775 4795 scope.go:117] "RemoveContainer" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624942 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} err="failed to get container status \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": rpc error: code = NotFound desc = could not find container \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": container with ID starting with becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7 not found: ID does not exist"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.624964 4795 scope.go:117] "RemoveContainer" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625157 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} err="failed to get container status \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": rpc error: code = NotFound desc = could not find container \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": container with ID starting with c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8 not found: ID does not exist"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625175 4795 scope.go:117] "RemoveContainer" containerID="02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625361 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889"} err="failed to get container status \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": rpc error: code = NotFound desc = could not find container \"02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889\": container with ID starting with 02006a3864a87b73be1a0418cf8f601b244c9a6fb357a9804c06d976a90bc889 not found: ID does not exist"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625378 4795 scope.go:117] "RemoveContainer" containerID="fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625543 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93"} err="failed to get container status \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": rpc error: code = NotFound desc = could not find container \"fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93\": container with ID starting with fe92eaa4f3c70288db22904bd2f224240a1b63c5c1c9fd8b5f5a487541482e93 not found: ID does not exist"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625559 4795 scope.go:117] "RemoveContainer" containerID="becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625720 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7"} err="failed to get container status \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": rpc error: code = NotFound desc = could not find container \"becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7\": container with ID starting with becfe9988b038b47c29fa57e834f6c2a1fa64dc64a5c0c98dc4b5a849f5a71b7 not found: ID does not exist"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.625738 4795 scope.go:117] "RemoveContainer" containerID="c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.626295 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8"} err="failed to get container status \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": rpc error: code = NotFound desc = could not find container \"c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8\": container with ID starting with c37c062f9373901805d4712e22ef8f7d527c15c56a4f2323a4c1e39a881f61b8 not found: ID does not exist"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.694826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v8sl\" (UniqueName: \"kubernetes.io/projected/81c4fa02-a2cf-4349-afe3-292e38b50e33-kube-api-access-5v8sl\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.694889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.694930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-log-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.694951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.694997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-config-data\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.695033 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-run-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.695081 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.695202 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-scripts\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-run-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796603 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-scripts\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796763 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v8sl\" (UniqueName: \"kubernetes.io/projected/81c4fa02-a2cf-4349-afe3-292e38b50e33-kube-api-access-5v8sl\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796815 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.796845 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-log-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.797605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.797398 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-run-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.797539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/81c4fa02-a2cf-4349-afe3-292e38b50e33-log-httpd\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.797828 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-config-data\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.804124 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.807165 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.807225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.812386 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-scripts\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.813872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81c4fa02-a2cf-4349-afe3-292e38b50e33-config-data\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.820733 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v8sl\" (UniqueName: \"kubernetes.io/projected/81c4fa02-a2cf-4349-afe3-292e38b50e33-kube-api-access-5v8sl\") pod \"ceilometer-0\" (UID: \"81c4fa02-a2cf-4349-afe3-292e38b50e33\") " pod="openstack/ceilometer-0"
Mar 20 17:40:43 crc kubenswrapper[4795]: I0320 17:40:43.875327 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:40:44 crc kubenswrapper[4795]: W0320 17:40:44.372185 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81c4fa02_a2cf_4349_afe3_292e38b50e33.slice/crio-aaf8bb02d9338d0c6b8f9c03a6a32821c74dddabf800ebc89f130aa76508ae1b WatchSource:0}: Error finding container aaf8bb02d9338d0c6b8f9c03a6a32821c74dddabf800ebc89f130aa76508ae1b: Status 404 returned error can't find the container with id aaf8bb02d9338d0c6b8f9c03a6a32821c74dddabf800ebc89f130aa76508ae1b
Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.380210 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.474778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c4fa02-a2cf-4349-afe3-292e38b50e33","Type":"ContainerStarted","Data":"aaf8bb02d9338d0c6b8f9c03a6a32821c74dddabf800ebc89f130aa76508ae1b"}
Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.479446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerStarted","Data":"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e"}
Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.479475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerStarted","Data":"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb"}
Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.518848 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.518828558 podStartE2EDuration="2.518828558s" podCreationTimestamp="2026-03-20 17:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:40:44.501941123 +0000 UTC m=+1387.959972694" watchObservedRunningTime="2026-03-20 17:40:44.518828558 +0000 UTC m=+1387.976860109"
Mar 20 17:40:44 crc kubenswrapper[4795]: I0320 17:40:44.732400 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 17:40:45 crc kubenswrapper[4795]: I0320 17:40:45.265255 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0" path="/var/lib/kubelet/pods/e15f36e1-3fd7-43bc-9aaa-d793c6a43fd0/volumes"
Mar 20 17:40:45 crc kubenswrapper[4795]: I0320 17:40:45.490874 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c4fa02-a2cf-4349-afe3-292e38b50e33","Type":"ContainerStarted","Data":"256d2365325acd1f314b08010c8051a4b1f410b7e0de4b9f549eb9f57929213f"}
Mar 20 17:40:45 crc kubenswrapper[4795]: I0320 17:40:45.823240 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7"
Mar 20 17:40:45 crc kubenswrapper[4795]: I0320 17:40:45.904866 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"]
Mar 20 17:40:45 crc kubenswrapper[4795]: I0320 17:40:45.905195 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="dnsmasq-dns" containerID="cri-o://fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25" gracePeriod=10
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.473433 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j"
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474449 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") "
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474572 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") "
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474603 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9h7l\" (UniqueName: \"kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") "
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474787 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") "
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474831 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") "
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.474905 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb\") pod \"3d499f64-fbe0-4f89-af22-619a306e7857\" (UID: \"3d499f64-fbe0-4f89-af22-619a306e7857\") "
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.485885 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l" (OuterVolumeSpecName: "kube-api-access-c9h7l") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "kube-api-access-c9h7l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.515915 4795 generic.go:334] "Generic (PLEG): container finished" podID="3d499f64-fbe0-4f89-af22-619a306e7857" containerID="fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25" exitCode=0
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.515984 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j"
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.515986 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" event={"ID":"3d499f64-fbe0-4f89-af22-619a306e7857","Type":"ContainerDied","Data":"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25"}
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.516016 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-gnz2j" event={"ID":"3d499f64-fbe0-4f89-af22-619a306e7857","Type":"ContainerDied","Data":"033d346b30a1db4f7e9a01124daad98761199eedb3969eb07af8bdf4a1a9d7f0"}
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.516047 4795 scope.go:117] "RemoveContainer" containerID="fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25"
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.522265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c4fa02-a2cf-4349-afe3-292e38b50e33","Type":"ContainerStarted","Data":"17e7314f308a8f19b6455d8e266b166b150633defda1fb6d5be070ab71fa388a"}
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.522297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c4fa02-a2cf-4349-afe3-292e38b50e33","Type":"ContainerStarted","Data":"4b95271b2da881de9a6e5f405f425092a87dc98b58d5b4d26d8419924b278256"}
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.536031 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.541031 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.542587 4795 scope.go:117] "RemoveContainer" containerID="bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953"
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.553344 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config" (OuterVolumeSpecName: "config") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.556946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.560790 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3d499f64-fbe0-4f89-af22-619a306e7857" (UID: "3d499f64-fbe0-4f89-af22-619a306e7857"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.566265 4795 scope.go:117] "RemoveContainer" containerID="fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25"
Mar 20 17:40:46 crc kubenswrapper[4795]: E0320 17:40:46.566742 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25\": container with ID starting with fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25 not found: ID does not exist" containerID="fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25"
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.566768 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25"} err="failed to get container status \"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25\": rpc error: code = NotFound desc = could not find container \"fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25\": container with ID starting with fd174cd825d5021fd0ce01d2813a44a96af6fad0393c375e93c62d062e934c25 not found: ID does not exist"
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.566888 4795 scope.go:117] "RemoveContainer" containerID="bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953"
Mar 20 17:40:46 crc kubenswrapper[4795]: E0320 17:40:46.567162 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953\": container with ID starting with bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953 not found: ID does not exist" containerID="bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953"
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.567180 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953"} err="failed to get container status \"bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953\": rpc error: code = NotFound desc = could not find container \"bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953\": container with ID starting with bb41594fc00c4dc9abf4e323cac5401398688c569a4bba0ee09ea73b8a2b1953 not found: ID does not exist"
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577077 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577124 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577133 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577143 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577154 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d499f64-fbe0-4f89-af22-619a306e7857-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.577162 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9h7l\" (UniqueName: \"kubernetes.io/projected/3d499f64-fbe0-4f89-af22-619a306e7857-kube-api-access-c9h7l\") on node \"crc\" DevicePath \"\""
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.847196 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"]
Mar 20 17:40:46 crc kubenswrapper[4795]: I0320 17:40:46.858473 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-gnz2j"]
Mar 20 17:40:47 crc kubenswrapper[4795]: I0320 17:40:47.261521 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" path="/var/lib/kubelet/pods/3d499f64-fbe0-4f89-af22-619a306e7857/volumes"
Mar 20 17:40:47 crc kubenswrapper[4795]: I0320 17:40:47.313089 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6697f55ff5-fj55x" podUID="e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Mar 20 17:40:49 crc kubenswrapper[4795]: I0320 17:40:49.732048 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 17:40:49 crc kubenswrapper[4795]: I0320 17:40:49.762886 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.570476 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"81c4fa02-a2cf-4349-afe3-292e38b50e33","Type":"ContainerStarted","Data":"7234f4ee023a6896871077fc9773be9638c0383d34fbfecc5d3d8e15abd99bd9"}
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.595649 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.600430 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.530317599 podStartE2EDuration="7.600414079s" podCreationTimestamp="2026-03-20 17:40:43 +0000 UTC" firstStartedPulling="2026-03-20 17:40:44.374333949 +0000 UTC m=+1387.832365490" lastFinishedPulling="2026-03-20 17:40:49.444430389 +0000 UTC m=+1392.902461970" observedRunningTime="2026-03-20 17:40:50.591764175 +0000 UTC m=+1394.049795736" watchObservedRunningTime="2026-03-20 17:40:50.600414079 +0000 UTC m=+1394.058445620"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.743978 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-67csj"]
Mar 20 17:40:50 crc kubenswrapper[4795]: E0320 17:40:50.745185 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="init"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.745260 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="init"
Mar 20 17:40:50 crc kubenswrapper[4795]: E0320 17:40:50.745326 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="dnsmasq-dns"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.745383 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="dnsmasq-dns"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.745614 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d499f64-fbe0-4f89-af22-619a306e7857" containerName="dnsmasq-dns"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.746260 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.748488 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.748995 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.759682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csvx4\" (UniqueName: \"kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.759771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.759812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.759843 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.765879 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-67csj"]
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.824067 4795 scope.go:117] "RemoveContainer" containerID="22dcfbd2225d9c0ffa8966a0b94e82b8d86d62d5548dc394c4f180ba099a7edd"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.862100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csvx4\" (UniqueName: \"kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.862185 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.862234 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.862263 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.867769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.868898 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.869047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:50 crc kubenswrapper[4795]: I0320 17:40:50.878412 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csvx4\" (UniqueName: \"kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4\") pod \"nova-cell1-cell-mapping-67csj\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " pod="openstack/nova-cell1-cell-mapping-67csj"
Mar 20 17:40:51 crc kubenswrapper[4795]: I0320 17:40:51.067573 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:51 crc kubenswrapper[4795]: I0320 17:40:51.519299 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-67csj"] Mar 20 17:40:51 crc kubenswrapper[4795]: W0320 17:40:51.525632 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbc8602c_1f19_4825_b3e5_32d643f12430.slice/crio-a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28 WatchSource:0}: Error finding container a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28: Status 404 returned error can't find the container with id a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28 Mar 20 17:40:51 crc kubenswrapper[4795]: I0320 17:40:51.585239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-67csj" event={"ID":"fbc8602c-1f19-4825-b3e5-32d643f12430","Type":"ContainerStarted","Data":"a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28"} Mar 20 17:40:51 crc kubenswrapper[4795]: I0320 17:40:51.586195 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:40:52 crc kubenswrapper[4795]: I0320 17:40:52.597998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-67csj" event={"ID":"fbc8602c-1f19-4825-b3e5-32d643f12430","Type":"ContainerStarted","Data":"c3dbd02db17863581582e40f291eb346e5dca8aa3c7d277d71e53142232286eb"} Mar 20 17:40:52 crc kubenswrapper[4795]: I0320 17:40:52.633430 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-67csj" podStartSLOduration=2.633401236 podStartE2EDuration="2.633401236s" podCreationTimestamp="2026-03-20 17:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 17:40:52.620410439 +0000 UTC m=+1396.078442020" watchObservedRunningTime="2026-03-20 17:40:52.633401236 +0000 UTC m=+1396.091432817" Mar 20 17:40:52 crc kubenswrapper[4795]: I0320 17:40:52.827750 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:40:52 crc kubenswrapper[4795]: I0320 17:40:52.827825 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:40:53 crc kubenswrapper[4795]: I0320 17:40:53.840949 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:53 crc kubenswrapper[4795]: I0320 17:40:53.841597 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:40:57 crc kubenswrapper[4795]: I0320 17:40:57.653550 4795 generic.go:334] "Generic (PLEG): container finished" podID="fbc8602c-1f19-4825-b3e5-32d643f12430" containerID="c3dbd02db17863581582e40f291eb346e5dca8aa3c7d277d71e53142232286eb" exitCode=0 Mar 20 17:40:57 crc kubenswrapper[4795]: I0320 17:40:57.654139 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-67csj" event={"ID":"fbc8602c-1f19-4825-b3e5-32d643f12430","Type":"ContainerDied","Data":"c3dbd02db17863581582e40f291eb346e5dca8aa3c7d277d71e53142232286eb"} Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.097556 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.256194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle\") pod \"fbc8602c-1f19-4825-b3e5-32d643f12430\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.256302 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data\") pod \"fbc8602c-1f19-4825-b3e5-32d643f12430\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.256356 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csvx4\" (UniqueName: \"kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4\") pod \"fbc8602c-1f19-4825-b3e5-32d643f12430\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.256438 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts\") pod \"fbc8602c-1f19-4825-b3e5-32d643f12430\" (UID: \"fbc8602c-1f19-4825-b3e5-32d643f12430\") " Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.263968 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts" (OuterVolumeSpecName: "scripts") pod "fbc8602c-1f19-4825-b3e5-32d643f12430" (UID: "fbc8602c-1f19-4825-b3e5-32d643f12430"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.266721 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4" (OuterVolumeSpecName: "kube-api-access-csvx4") pod "fbc8602c-1f19-4825-b3e5-32d643f12430" (UID: "fbc8602c-1f19-4825-b3e5-32d643f12430"). InnerVolumeSpecName "kube-api-access-csvx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.294762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data" (OuterVolumeSpecName: "config-data") pod "fbc8602c-1f19-4825-b3e5-32d643f12430" (UID: "fbc8602c-1f19-4825-b3e5-32d643f12430"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.316051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbc8602c-1f19-4825-b3e5-32d643f12430" (UID: "fbc8602c-1f19-4825-b3e5-32d643f12430"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.359775 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.359815 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csvx4\" (UniqueName: \"kubernetes.io/projected/fbc8602c-1f19-4825-b3e5-32d643f12430-kube-api-access-csvx4\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.359828 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.359840 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbc8602c-1f19-4825-b3e5-32d643f12430-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.677613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-67csj" event={"ID":"fbc8602c-1f19-4825-b3e5-32d643f12430","Type":"ContainerDied","Data":"a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28"} Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.677957 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7f2f0762737db17239ed35ecf86be8d008034e0ff471bb6f8cfea0b1ea2fc28" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.677705 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-67csj" Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.930504 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.931091 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-log" containerID="cri-o://d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb" gracePeriod=30 Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.931204 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-api" containerID="cri-o://3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e" gracePeriod=30 Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.971028 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.971312 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerName="nova-scheduler-scheduler" containerID="cri-o://380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" gracePeriod=30 Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.987456 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.987701 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-log" containerID="cri-o://6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd" gracePeriod=30 Mar 20 17:40:59 crc kubenswrapper[4795]: I0320 17:40:59.987827 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-metadata" containerID="cri-o://4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14" gracePeriod=30 Mar 20 17:41:00 crc kubenswrapper[4795]: E0320 17:41:00.471532 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:41:00 crc kubenswrapper[4795]: E0320 17:41:00.473727 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:41:00 crc kubenswrapper[4795]: E0320 17:41:00.474760 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 17:41:00 crc kubenswrapper[4795]: E0320 17:41:00.474816 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerName="nova-scheduler-scheduler" Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.703547 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="58141da4-34b7-48d2-8648-8340b0e08c24" containerID="6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd" exitCode=143 Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.703644 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerDied","Data":"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd"} Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.710140 4795 generic.go:334] "Generic (PLEG): container finished" podID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerID="d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb" exitCode=143 Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.710199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerDied","Data":"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb"} Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.828281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:41:00 crc kubenswrapper[4795]: I0320 17:41:00.828342 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.553337 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.619145 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.646088 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.646353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhkc8\" (UniqueName: \"kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.646400 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.646419 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.646966 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.647026 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs\") pod \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\" (UID: \"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.647485 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs" (OuterVolumeSpecName: "logs") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.653340 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8" (OuterVolumeSpecName: "kube-api-access-hhkc8") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "kube-api-access-hhkc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.674704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data" (OuterVolumeSpecName: "config-data") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.690924 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.700143 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.720733 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" (UID: "dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.739745 4795 generic.go:334] "Generic (PLEG): container finished" podID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerID="3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e" exitCode=0 Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.739800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerDied","Data":"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e"} Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.739827 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207","Type":"ContainerDied","Data":"587df48f97c3203a94e266d052630fadcdcbe04e7e5ddd68a72a13d942007e47"} Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.739843 4795 scope.go:117] "RemoveContainer" containerID="3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e" Mar 20 17:41:03 crc 
kubenswrapper[4795]: I0320 17:41:03.739937 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.746085 4795 generic.go:334] "Generic (PLEG): container finished" podID="58141da4-34b7-48d2-8648-8340b0e08c24" containerID="4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14" exitCode=0 Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.746174 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.746171 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerDied","Data":"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14"} Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.746318 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58141da4-34b7-48d2-8648-8340b0e08c24","Type":"ContainerDied","Data":"468acb8849abdd77a144584691b96b8cbadeae923d66dd538f230c0aee8d52cb"} Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.748408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data\") pod \"58141da4-34b7-48d2-8648-8340b0e08c24\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.748562 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs\") pod \"58141da4-34b7-48d2-8648-8340b0e08c24\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.748670 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs\") pod \"58141da4-34b7-48d2-8648-8340b0e08c24\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.748734 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle\") pod \"58141da4-34b7-48d2-8648-8340b0e08c24\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.748791 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf\") pod \"58141da4-34b7-48d2-8648-8340b0e08c24\" (UID: \"58141da4-34b7-48d2-8648-8340b0e08c24\") " Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749189 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749205 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhkc8\" (UniqueName: \"kubernetes.io/projected/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-kube-api-access-hhkc8\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749216 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749224 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749232 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.749240 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.750561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs" (OuterVolumeSpecName: "logs") pod "58141da4-34b7-48d2-8648-8340b0e08c24" (UID: "58141da4-34b7-48d2-8648-8340b0e08c24"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.751842 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf" (OuterVolumeSpecName: "kube-api-access-zb8pf") pod "58141da4-34b7-48d2-8648-8340b0e08c24" (UID: "58141da4-34b7-48d2-8648-8340b0e08c24"). InnerVolumeSpecName "kube-api-access-zb8pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.775077 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data" (OuterVolumeSpecName: "config-data") pod "58141da4-34b7-48d2-8648-8340b0e08c24" (UID: "58141da4-34b7-48d2-8648-8340b0e08c24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.779677 4795 scope.go:117] "RemoveContainer" containerID="d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.781066 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58141da4-34b7-48d2-8648-8340b0e08c24" (UID: "58141da4-34b7-48d2-8648-8340b0e08c24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.807478 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.815385 4795 scope.go:117] "RemoveContainer" containerID="3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.815824 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e\": container with ID starting with 3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e not found: ID does not exist" containerID="3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.815856 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e"} err="failed to get container status \"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e\": rpc error: code = NotFound desc = could not find container \"3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e\": container with ID starting with 
3778c114ac9fd3c17c7d0d0ad58c1ee94d4666dd338abfa00364ba0c905bea6e not found: ID does not exist" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.815880 4795 scope.go:117] "RemoveContainer" containerID="d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.816156 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb\": container with ID starting with d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb not found: ID does not exist" containerID="d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.816175 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb"} err="failed to get container status \"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb\": rpc error: code = NotFound desc = could not find container \"d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb\": container with ID starting with d217946dad3e1f581bacf89e0e327543af3ee2c238176958d47a66b73be7f6fb not found: ID does not exist" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.816187 4795 scope.go:117] "RemoveContainer" containerID="4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.819994 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.834562 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.834966 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" 
containerName="nova-api-api" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.834985 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-api" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.835011 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-log" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835018 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-log" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.835030 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-metadata" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835037 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-metadata" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.835061 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbc8602c-1f19-4825-b3e5-32d643f12430" containerName="nova-manage" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835069 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc8602c-1f19-4825-b3e5-32d643f12430" containerName="nova-manage" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.835083 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-log" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835091 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-log" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835282 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" 
containerName="nova-api-api" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835302 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" containerName="nova-api-log" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835315 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-log" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835330 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" containerName="nova-metadata-metadata" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.835341 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbc8602c-1f19-4825-b3e5-32d643f12430" containerName="nova-manage" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.836525 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.840913 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.841016 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.841167 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.846163 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.846444 4795 scope.go:117] "RemoveContainer" containerID="6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.847643 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "58141da4-34b7-48d2-8648-8340b0e08c24" (UID: "58141da4-34b7-48d2-8648-8340b0e08c24"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.850925 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58141da4-34b7-48d2-8648-8340b0e08c24-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.850970 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.850983 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.850992 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb8pf\" (UniqueName: \"kubernetes.io/projected/58141da4-34b7-48d2-8648-8340b0e08c24-kube-api-access-zb8pf\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.851001 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58141da4-34b7-48d2-8648-8340b0e08c24-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.873462 4795 scope.go:117] "RemoveContainer" containerID="4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.873921 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14\": container with ID starting with 4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14 not found: ID does not exist" containerID="4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.873948 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14"} err="failed to get container status \"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14\": rpc error: code = NotFound desc = could not find container \"4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14\": container with ID starting with 4918bc9b1721ddb39d10c2112c56a4ba68e6a7d1609b35ba9e13e8d117fa3e14 not found: ID does not exist" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.873969 4795 scope.go:117] "RemoveContainer" containerID="6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd" Mar 20 17:41:03 crc kubenswrapper[4795]: E0320 17:41:03.874175 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd\": container with ID starting with 6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd not found: ID does not exist" containerID="6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.874211 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd"} err="failed to get container status \"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd\": rpc error: code = NotFound desc = could not find container 
\"6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd\": container with ID starting with 6feb3f057cfaaa1af4c2978a3f585dcc3fedfd63e9a30ba35e5f32d3fafad4cd not found: ID does not exist" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkjzt\" (UniqueName: \"kubernetes.io/projected/480a6609-0395-4bda-9ec8-a3ebf30931a7-kube-api-access-xkjzt\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952464 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-config-data\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952487 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952524 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/480a6609-0395-4bda-9ec8-a3ebf30931a7-logs\") pod \"nova-api-0\" (UID: 
\"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:03 crc kubenswrapper[4795]: I0320 17:41:03.952541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.054849 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.054915 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkjzt\" (UniqueName: \"kubernetes.io/projected/480a6609-0395-4bda-9ec8-a3ebf30931a7-kube-api-access-xkjzt\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.054967 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-config-data\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.054990 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.055040 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/480a6609-0395-4bda-9ec8-a3ebf30931a7-logs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.055061 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.056023 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/480a6609-0395-4bda-9ec8-a3ebf30931a7-logs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.059770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.060195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.060482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-config-data\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.062672 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/480a6609-0395-4bda-9ec8-a3ebf30931a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.076051 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkjzt\" (UniqueName: \"kubernetes.io/projected/480a6609-0395-4bda-9ec8-a3ebf30931a7-kube-api-access-xkjzt\") pod \"nova-api-0\" (UID: \"480a6609-0395-4bda-9ec8-a3ebf30931a7\") " pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.083561 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.102140 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.139736 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.150415 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.151936 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.152908 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.153154 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.162233 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.257930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.258202 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-config-data\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.258237 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zw7f\" (UniqueName: \"kubernetes.io/projected/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-kube-api-access-6zw7f\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.258333 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.258360 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-logs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.359919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-logs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.360172 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.360211 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-config-data\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.361502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-logs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.361625 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6zw7f\" (UniqueName: \"kubernetes.io/projected/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-kube-api-access-6zw7f\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.361958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.367408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.369500 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-config-data\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.384267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.388357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zw7f\" (UniqueName: \"kubernetes.io/projected/ff4af01b-01b5-4154-8591-7ec99e3d6ef0-kube-api-access-6zw7f\") pod \"nova-metadata-0\" (UID: 
\"ff4af01b-01b5-4154-8591-7ec99e3d6ef0\") " pod="openstack/nova-metadata-0" Mar 20 17:41:04 crc kubenswrapper[4795]: I0320 17:41:04.448534 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.598799 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.608390 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.759279 4795 generic.go:334] "Generic (PLEG): container finished" podID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" exitCode=0 Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.759388 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.759395 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef24c878-faa8-4a0b-a303-951d0a457eef","Type":"ContainerDied","Data":"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463"} Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.759435 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ef24c878-faa8-4a0b-a303-951d0a457eef","Type":"ContainerDied","Data":"84bfddf694ecbc3a7df8e31cc881d1692c1599f3d1dc0bd61b96f49386da6d0c"} Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.759452 4795 scope.go:117] "RemoveContainer" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.760894 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"480a6609-0395-4bda-9ec8-a3ebf30931a7","Type":"ContainerStarted","Data":"befd9b02cdc9dc44ea8af5b4b2cc3a53e4fc63ae1c71353770f8bfbe9139169f"} Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.760922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"480a6609-0395-4bda-9ec8-a3ebf30931a7","Type":"ContainerStarted","Data":"83b850417ad4d2bdfcfb26a398d21951c423e8b6568b671ed54f3611a24d1406"} Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.769311 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data\") pod \"ef24c878-faa8-4a0b-a303-951d0a457eef\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.769425 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle\") pod \"ef24c878-faa8-4a0b-a303-951d0a457eef\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.769549 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgnnx\" (UniqueName: \"kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx\") pod \"ef24c878-faa8-4a0b-a303-951d0a457eef\" (UID: \"ef24c878-faa8-4a0b-a303-951d0a457eef\") " Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.773778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx" (OuterVolumeSpecName: "kube-api-access-jgnnx") pod "ef24c878-faa8-4a0b-a303-951d0a457eef" (UID: "ef24c878-faa8-4a0b-a303-951d0a457eef"). InnerVolumeSpecName "kube-api-access-jgnnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.784615 4795 scope.go:117] "RemoveContainer" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" Mar 20 17:41:05 crc kubenswrapper[4795]: E0320 17:41:04.785575 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463\": container with ID starting with 380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463 not found: ID does not exist" containerID="380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.785612 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463"} err="failed to get container status \"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463\": rpc error: code = NotFound desc = could not find container \"380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463\": container with ID starting with 380ddb60febe734d63f09397fca2197d92533805821c806d06d5d24e56a3a463 not found: ID does not exist" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.802308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef24c878-faa8-4a0b-a303-951d0a457eef" (UID: "ef24c878-faa8-4a0b-a303-951d0a457eef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.804813 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data" (OuterVolumeSpecName: "config-data") pod "ef24c878-faa8-4a0b-a303-951d0a457eef" (UID: "ef24c878-faa8-4a0b-a303-951d0a457eef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.871836 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.871867 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef24c878-faa8-4a0b-a303-951d0a457eef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:04.871879 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgnnx\" (UniqueName: \"kubernetes.io/projected/ef24c878-faa8-4a0b-a303-951d0a457eef-kube-api-access-jgnnx\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.099426 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.127746 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.160161 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:41:05 crc kubenswrapper[4795]: E0320 17:41:05.160570 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerName="nova-scheduler-scheduler" Mar 20 17:41:05 crc 
kubenswrapper[4795]: I0320 17:41:05.160584 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerName="nova-scheduler-scheduler"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.160800 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" containerName="nova-scheduler-scheduler"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.161400 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.163852 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.168520 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.270241 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58141da4-34b7-48d2-8648-8340b0e08c24" path="/var/lib/kubelet/pods/58141da4-34b7-48d2-8648-8340b0e08c24/volumes"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.271224 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207" path="/var/lib/kubelet/pods/dc7f1fae-ee02-4e5c-a06c-9cfacbdc5207/volumes"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.271970 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef24c878-faa8-4a0b-a303-951d0a457eef" path="/var/lib/kubelet/pods/ef24c878-faa8-4a0b-a303-951d0a457eef/volumes"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.279699 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-config-data\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.279816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.279860 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2lnp\" (UniqueName: \"kubernetes.io/projected/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-kube-api-access-t2lnp\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.382122 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-config-data\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.382248 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.382296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2lnp\" (UniqueName: \"kubernetes.io/projected/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-kube-api-access-t2lnp\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.387644 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-config-data\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.399956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.400797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2lnp\" (UniqueName: \"kubernetes.io/projected/c23f56ff-eceb-4891-87e5-57ebeb7eba8d-kube-api-access-t2lnp\") pod \"nova-scheduler-0\" (UID: \"c23f56ff-eceb-4891-87e5-57ebeb7eba8d\") " pod="openstack/nova-scheduler-0"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.520939 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.545297 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.770285 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff4af01b-01b5-4154-8591-7ec99e3d6ef0","Type":"ContainerStarted","Data":"799bc58ae9c31f2b20b85a53bf74bab2facc31011fbf66c3eaf5d018bd956aca"}
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.773743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"480a6609-0395-4bda-9ec8-a3ebf30931a7","Type":"ContainerStarted","Data":"083a7a13f04e5180c257f05356b9040c12dea0510d5f921c7ebdd383b6985b45"}
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.808598 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.808573643 podStartE2EDuration="2.808573643s" podCreationTimestamp="2026-03-20 17:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:41:05.793983709 +0000 UTC m=+1409.252015270" watchObservedRunningTime="2026-03-20 17:41:05.808573643 +0000 UTC m=+1409.266605204"
Mar 20 17:41:05 crc kubenswrapper[4795]: W0320 17:41:05.984127 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc23f56ff_eceb_4891_87e5_57ebeb7eba8d.slice/crio-8ad662a0532ae35ef56ef051374a286dced762b665b30f581b6dbf6787f71776 WatchSource:0}: Error finding container 8ad662a0532ae35ef56ef051374a286dced762b665b30f581b6dbf6787f71776: Status 404 returned error can't find the container with id 8ad662a0532ae35ef56ef051374a286dced762b665b30f581b6dbf6787f71776
Mar 20 17:41:05 crc kubenswrapper[4795]: I0320 17:41:05.987931 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.784929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c23f56ff-eceb-4891-87e5-57ebeb7eba8d","Type":"ContainerStarted","Data":"85945fef7e7e4818965310af4a0201164d31db19ff0f5e8619e3262d43b4c864"}
Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.785231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c23f56ff-eceb-4891-87e5-57ebeb7eba8d","Type":"ContainerStarted","Data":"8ad662a0532ae35ef56ef051374a286dced762b665b30f581b6dbf6787f71776"}
Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.789058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff4af01b-01b5-4154-8591-7ec99e3d6ef0","Type":"ContainerStarted","Data":"bb468ec1e38e6252f1835d5a48e30b0929f67abd3f7646a4757e2335b49b5959"}
Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.789297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff4af01b-01b5-4154-8591-7ec99e3d6ef0","Type":"ContainerStarted","Data":"b10f2f42fbd9069430b1583e9cb52464dfc2872b3963701e1e04e31f7d461b45"}
Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.807893 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.807872503 podStartE2EDuration="1.807872503s" podCreationTimestamp="2026-03-20 17:41:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:41:06.803932512 +0000 UTC m=+1410.261964053" watchObservedRunningTime="2026-03-20 17:41:06.807872503 +0000 UTC m=+1410.265904044"
Mar 20 17:41:06 crc kubenswrapper[4795]: I0320 17:41:06.830651 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.830628977 podStartE2EDuration="2.830628977s" podCreationTimestamp="2026-03-20 17:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:41:06.82742327 +0000 UTC m=+1410.285454851" watchObservedRunningTime="2026-03-20 17:41:06.830628977 +0000 UTC m=+1410.288660528"
Mar 20 17:41:10 crc kubenswrapper[4795]: I0320 17:41:10.521347 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 20 17:41:11 crc kubenswrapper[4795]: I0320 17:41:11.299931 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:41:11 crc kubenswrapper[4795]: I0320 17:41:11.300289 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:41:13 crc kubenswrapper[4795]: I0320 17:41:13.888515 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 20 17:41:14 crc kubenswrapper[4795]: I0320 17:41:14.152921 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 17:41:14 crc kubenswrapper[4795]: I0320 17:41:14.152959 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 17:41:14 crc kubenswrapper[4795]: I0320 17:41:14.449755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 17:41:14 crc kubenswrapper[4795]: I0320 17:41:14.449819 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.167894 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="480a6609-0395-4bda-9ec8-a3ebf30931a7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.167898 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="480a6609-0395-4bda-9ec8-a3ebf30931a7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.459813 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ff4af01b-01b5-4154-8591-7ec99e3d6ef0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.460062 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ff4af01b-01b5-4154-8591-7ec99e3d6ef0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.521174 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.566010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 17:41:15 crc kubenswrapper[4795]: I0320 17:41:15.925566 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 20 17:41:22 crc kubenswrapper[4795]: I0320 17:41:22.152863 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 17:41:22 crc kubenswrapper[4795]: I0320 17:41:22.153455 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 17:41:22 crc kubenswrapper[4795]: I0320 17:41:22.449559 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 17:41:22 crc kubenswrapper[4795]: I0320 17:41:22.449954 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.159320 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.159739 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.169092 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.170027 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.455604 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.457266 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.461465 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 17:41:24 crc kubenswrapper[4795]: I0320 17:41:24.993919 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 17:41:32 crc kubenswrapper[4795]: I0320 17:41:32.909300 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 17:41:33 crc kubenswrapper[4795]: I0320 17:41:33.920078 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 17:41:36 crc kubenswrapper[4795]: I0320 17:41:36.038607 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="rabbitmq" containerID="cri-o://95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3" gracePeriod=57
Mar 20 17:41:36 crc kubenswrapper[4795]: I0320 17:41:36.781148 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="rabbitmq" containerID="cri-o://930cc5d12a5b8ceb897b37f689f02ab87b93b53244832868d8761d1d4336b1e3" gracePeriod=58
Mar 20 17:41:37 crc kubenswrapper[4795]: I0320 17:41:37.500176 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused"
Mar 20 17:41:37 crc kubenswrapper[4795]: I0320 17:41:37.832677 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused"
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.125667 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerID="930cc5d12a5b8ceb897b37f689f02ab87b93b53244832868d8761d1d4336b1e3" exitCode=0
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.125770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerDied","Data":"930cc5d12a5b8ceb897b37f689f02ab87b93b53244832868d8761d1d4336b1e3"}
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.417345 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583271 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583317 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583488 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583554 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6r2k\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583579 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583633 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583655 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583716 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.583746 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie\") pod \"d3e6834b-7e74-46f8-a734-b473080c05d3\" (UID: \"d3e6834b-7e74-46f8-a734-b473080c05d3\") "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.584564 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.584891 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.588506 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.589033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info" (OuterVolumeSpecName: "pod-info") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.589307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.590773 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.592157 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.604177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k" (OuterVolumeSpecName: "kube-api-access-j6r2k") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "kube-api-access-j6r2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.627416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data" (OuterVolumeSpecName: "config-data") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.636585 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf" (OuterVolumeSpecName: "server-conf") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685572 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685603 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3e6834b-7e74-46f8-a734-b473080c05d3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685618 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6r2k\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-kube-api-access-j6r2k\") on node \"crc\" DevicePath \"\""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685649 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685660 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685723 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685738 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685749 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3e6834b-7e74-46f8-a734-b473080c05d3-server-conf\") on node \"crc\" DevicePath \"\""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685761 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.685771 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3e6834b-7e74-46f8-a734-b473080c05d3-pod-info\") on node \"crc\" DevicePath \"\""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.703028 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d3e6834b-7e74-46f8-a734-b473080c05d3" (UID: "d3e6834b-7e74-46f8-a734-b473080c05d3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.711966 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.787808 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3e6834b-7e74-46f8-a734-b473080c05d3-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 20 17:41:38 crc kubenswrapper[4795]: I0320 17:41:38.787843 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.143736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d3e6834b-7e74-46f8-a734-b473080c05d3","Type":"ContainerDied","Data":"12a00ee882324adc5e7b3fa5833c8430141d6a20302db2d5f549cf873b0d421d"}
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.143797 4795 scope.go:117] "RemoveContainer" containerID="930cc5d12a5b8ceb897b37f689f02ab87b93b53244832868d8761d1d4336b1e3"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.144143 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.167707 4795 scope.go:117] "RemoveContainer" containerID="5473602d5499b1067c63d6b98d02f2810f56405e993453774e2f6c5d19c36aea"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.202022 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.234601 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.248096 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 17:41:39 crc kubenswrapper[4795]: E0320 17:41:39.248571 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="rabbitmq"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.248597 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="rabbitmq"
Mar 20 17:41:39 crc kubenswrapper[4795]: E0320 17:41:39.248635 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="setup-container"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.248645 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="setup-container"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.248870 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" containerName="rabbitmq"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.250172 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.254149 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.254606 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.254774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.254899 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.255000 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.255134 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pf5bc"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.255253 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.278915 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e6834b-7e74-46f8-a734-b473080c05d3" path="/var/lib/kubelet/pods/d3e6834b-7e74-46f8-a734-b473080c05d3/volumes"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.279646 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.399905 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.399981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400042 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400137 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400220 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400335 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6cbc\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-kube-api-access-c6cbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.400405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502053 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502197 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20
17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502739 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502846 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502947 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6cbc\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-kube-api-access-c6cbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.502818 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.503026 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.503130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.504027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.504179 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.504735 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.506489 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.506490 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.510473 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.512237 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.512628 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.539099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6cbc\" (UniqueName: \"kubernetes.io/projected/30c1ffc4-752a-4b0a-a95b-2bfbc458dc53-kube-api-access-c6cbc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.539178 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:39 crc kubenswrapper[4795]: I0320 17:41:39.598011 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:41:40 crc kubenswrapper[4795]: W0320 17:41:40.127626 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30c1ffc4_752a_4b0a_a95b_2bfbc458dc53.slice/crio-1a67d7e8adeac60383a3acf7797cced3bf60bcab9e2b95a0728c099729e98f9f WatchSource:0}: Error finding container 1a67d7e8adeac60383a3acf7797cced3bf60bcab9e2b95a0728c099729e98f9f: Status 404 returned error can't find the container with id 1a67d7e8adeac60383a3acf7797cced3bf60bcab9e2b95a0728c099729e98f9f Mar 20 17:41:40 crc kubenswrapper[4795]: I0320 17:41:40.128449 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:41:40 crc kubenswrapper[4795]: I0320 17:41:40.156108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53","Type":"ContainerStarted","Data":"1a67d7e8adeac60383a3acf7797cced3bf60bcab9e2b95a0728c099729e98f9f"} Mar 20 17:41:41 crc kubenswrapper[4795]: I0320 17:41:41.299988 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:41:41 crc kubenswrapper[4795]: I0320 17:41:41.300027 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.178357 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53","Type":"ContainerStarted","Data":"91b2765becbe485413b561b8cb2a1ab6831f2a2f0328f3ad53837ee41431baef"} Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.768545 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.871929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873080 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4h5b\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873128 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873157 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873478 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873548 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873607 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.873752 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf\") pod \"b8103489-e552-49b0-a32a-1069a46feff9\" (UID: \"b8103489-e552-49b0-a32a-1069a46feff9\") " Mar 20 17:41:42 crc kubenswrapper[4795]: 
I0320 17:41:42.874774 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.879997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.880192 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.880357 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b" (OuterVolumeSpecName: "kube-api-access-w4h5b") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "kube-api-access-w4h5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.887200 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info" (OuterVolumeSpecName: "pod-info") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.888068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.891879 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.902870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf" (OuterVolumeSpecName: "server-conf") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.903967 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.917243 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data" (OuterVolumeSpecName: "config-data") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.975964 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976002 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976014 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4h5b\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-kube-api-access-w4h5b\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976024 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-config-data\") on node \"crc\" 
DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976033 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8103489-e552-49b0-a32a-1069a46feff9-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976059 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976068 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8103489-e552-49b0-a32a-1069a46feff9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976077 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976085 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:42 crc kubenswrapper[4795]: I0320 17:41:42.976095 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8103489-e552-49b0-a32a-1069a46feff9-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.002141 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b8103489-e552-49b0-a32a-1069a46feff9" (UID: "b8103489-e552-49b0-a32a-1069a46feff9"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.003632 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.077885 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.077914 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8103489-e552-49b0-a32a-1069a46feff9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.187310 4795 generic.go:334] "Generic (PLEG): container finished" podID="b8103489-e552-49b0-a32a-1069a46feff9" containerID="95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3" exitCode=0 Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.187387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerDied","Data":"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3"} Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.187423 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b8103489-e552-49b0-a32a-1069a46feff9","Type":"ContainerDied","Data":"0e5a7ece35e45546c5839b24c64b62f1f72a3acb63297d6f97fc0dca60bde01d"} Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.187442 4795 scope.go:117] "RemoveContainer" containerID="95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.187576 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.239700 4795 scope.go:117] "RemoveContainer" containerID="ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.245875 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.267095 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.270133 4795 scope.go:117] "RemoveContainer" containerID="95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3" Mar 20 17:41:43 crc kubenswrapper[4795]: E0320 17:41:43.272597 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3\": container with ID starting with 95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3 not found: ID does not exist" containerID="95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.272631 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3"} err="failed to get container status \"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3\": rpc error: code = NotFound desc = could not find container \"95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3\": container with ID starting with 95a7f0a74b1520409cb1964b43583b3bb36ff5d4026d8a725ce85cd7a2dec8e3 not found: ID does not exist" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.272651 4795 scope.go:117] "RemoveContainer" containerID="ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157" Mar 20 17:41:43 crc 
kubenswrapper[4795]: E0320 17:41:43.275820 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157\": container with ID starting with ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157 not found: ID does not exist" containerID="ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.275852 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157"} err="failed to get container status \"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157\": rpc error: code = NotFound desc = could not find container \"ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157\": container with ID starting with ec7505bc396d477cfe1e79cc570089d527865cfa69d962631f6ff4ab0b704157 not found: ID does not exist" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.278020 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:41:43 crc kubenswrapper[4795]: E0320 17:41:43.278398 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="rabbitmq" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.278410 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="rabbitmq" Mar 20 17:41:43 crc kubenswrapper[4795]: E0320 17:41:43.278431 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="setup-container" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.278437 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="setup-container" Mar 20 17:41:43 crc 
kubenswrapper[4795]: I0320 17:41:43.278625 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8103489-e552-49b0-a32a-1069a46feff9" containerName="rabbitmq" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.279624 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.283826 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-84wwv" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.284130 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.284244 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.284448 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.285919 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.285943 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.285969 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.295159 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:41:43 crc kubenswrapper[4795]: E0320 17:41:43.348799 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8103489_e552_49b0_a32a_1069a46feff9.slice/crio-0e5a7ece35e45546c5839b24c64b62f1f72a3acb63297d6f97fc0dca60bde01d\": RecentStats: unable to find data in memory cache]" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408605 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchqp\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-kube-api-access-xchqp\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408776 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408902 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408953 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.408989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.409020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.409099 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.409728 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.511713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.511833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.511959 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512005 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: 
I0320 17:41:43.512062 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xchqp\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-kube-api-access-xchqp\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512110 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512939 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.512964 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.513279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.513646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.513743 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.513810 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.514232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 
17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.514302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.515746 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.530622 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.531191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.531272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.533728 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.546607 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchqp\" (UniqueName: \"kubernetes.io/projected/ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc-kube-api-access-xchqp\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.565128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc\") " pod="openstack/rabbitmq-server-0" Mar 20 17:41:43 crc kubenswrapper[4795]: I0320 17:41:43.646542 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:41:44 crc kubenswrapper[4795]: I0320 17:41:44.177108 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:41:44 crc kubenswrapper[4795]: W0320 17:41:44.183566 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad841f4d_fa5f_4383_86d5_ab5a93f6e7fc.slice/crio-3fe36fa21c6591620fd3178e1c1040a796f33f9c0e121e731657b304af794a0d WatchSource:0}: Error finding container 3fe36fa21c6591620fd3178e1c1040a796f33f9c0e121e731657b304af794a0d: Status 404 returned error can't find the container with id 3fe36fa21c6591620fd3178e1c1040a796f33f9c0e121e731657b304af794a0d Mar 20 17:41:44 crc kubenswrapper[4795]: I0320 17:41:44.199175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc","Type":"ContainerStarted","Data":"3fe36fa21c6591620fd3178e1c1040a796f33f9c0e121e731657b304af794a0d"} Mar 20 17:41:45 crc kubenswrapper[4795]: I0320 17:41:45.265494 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8103489-e552-49b0-a32a-1069a46feff9" path="/var/lib/kubelet/pods/b8103489-e552-49b0-a32a-1069a46feff9/volumes" Mar 20 17:41:45 crc kubenswrapper[4795]: I0320 17:41:45.966499 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"] Mar 20 17:41:45 crc kubenswrapper[4795]: I0320 17:41:45.968622 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:45 crc kubenswrapper[4795]: I0320 17:41:45.970525 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 20 17:41:45 crc kubenswrapper[4795]: I0320 17:41:45.997073 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"] Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069102 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqx4z\" (UniqueName: \"kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.069789 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171391 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171485 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.171614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.172288 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc 
kubenswrapper[4795]: I0320 17:41:46.172361 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.172360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqx4z\" (UniqueName: \"kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.172581 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.173064 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.173338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.173359 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.189404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqx4z\" (UniqueName: \"kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z\") pod \"dnsmasq-dns-79bd4cc8c9-ttfhl\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.219081 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc","Type":"ContainerStarted","Data":"6c9bb98a5b27dfaab93f9b0bf86e2dc36843d779c16099e2338e7ce5f1541db7"} Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.290762 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:46 crc kubenswrapper[4795]: I0320 17:41:46.547153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"] Mar 20 17:41:47 crc kubenswrapper[4795]: I0320 17:41:47.231953 4795 generic.go:334] "Generic (PLEG): container finished" podID="38f2311f-ace5-4469-906b-05443d175f81" containerID="cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff" exitCode=0 Mar 20 17:41:47 crc kubenswrapper[4795]: I0320 17:41:47.232030 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" event={"ID":"38f2311f-ace5-4469-906b-05443d175f81","Type":"ContainerDied","Data":"cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff"} Mar 20 17:41:47 crc kubenswrapper[4795]: I0320 17:41:47.232423 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" event={"ID":"38f2311f-ace5-4469-906b-05443d175f81","Type":"ContainerStarted","Data":"8a4135aec749bd5b9e65098bed9d5a85b6fe9d80a88ef7fade7429aaedbcd5f3"} Mar 20 17:41:48 crc kubenswrapper[4795]: I0320 17:41:48.243914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" event={"ID":"38f2311f-ace5-4469-906b-05443d175f81","Type":"ContainerStarted","Data":"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824"} Mar 20 17:41:48 crc kubenswrapper[4795]: I0320 17:41:48.244470 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:48 crc kubenswrapper[4795]: I0320 17:41:48.284103 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" podStartSLOduration=3.284077361 podStartE2EDuration="3.284077361s" podCreationTimestamp="2026-03-20 17:41:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:41:48.271799247 +0000 UTC m=+1451.729830868" watchObservedRunningTime="2026-03-20 17:41:48.284077361 +0000 UTC m=+1451.742108942" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.162485 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"] Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.166290 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.184881 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"] Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.292906 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.325786 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vrmg\" (UniqueName: \"kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.325962 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.326100 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.378145 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.378460 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="dnsmasq-dns" containerID="cri-o://9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8" gracePeriod=10 Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.431282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vrmg\" (UniqueName: \"kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.431739 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.431818 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.433368 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.433903 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.460203 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vrmg\" (UniqueName: \"kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg\") pod \"certified-operators-8rxkd\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") " pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.528068 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rxkd" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.565479 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-ch8jm"] Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.568414 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.610660 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-ch8jm"] Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641187 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-config\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641242 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641343 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641386 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641452 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdc26\" (UniqueName: \"kubernetes.io/projected/5c5c2934-fe58-4707-9bb7-a5e2372bad83-kube-api-access-kdc26\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.641534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.742836 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.742987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.743093 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdc26\" (UniqueName: \"kubernetes.io/projected/5c5c2934-fe58-4707-9bb7-a5e2372bad83-kube-api-access-kdc26\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.743190 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.743284 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.743384 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-config\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.743455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.744317 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.744707 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.744978 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.745270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.745621 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.745817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5c5c2934-fe58-4707-9bb7-a5e2372bad83-config\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:56 crc kubenswrapper[4795]: I0320 17:41:56.791380 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdc26\" (UniqueName: \"kubernetes.io/projected/5c5c2934-fe58-4707-9bb7-a5e2372bad83-kube-api-access-kdc26\") pod \"dnsmasq-dns-54ffdb7d8c-ch8jm\" (UID: \"5c5c2934-fe58-4707-9bb7-a5e2372bad83\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.006755 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.013040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.049812 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.049873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.049955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: 
\"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.049993 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmn5j\" (UniqueName: \"kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.050076 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.050143 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb\") pod \"35b8efb0-212f-4ee0-bb05-4655aff260b5\" (UID: \"35b8efb0-212f-4ee0-bb05-4655aff260b5\") " Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.058444 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j" (OuterVolumeSpecName: "kube-api-access-dmn5j") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "kube-api-access-dmn5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.104259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.108240 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.112068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.121197 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config" (OuterVolumeSpecName: "config") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.123024 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "35b8efb0-212f-4ee0-bb05-4655aff260b5" (UID: "35b8efb0-212f-4ee0-bb05-4655aff260b5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156108 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmn5j\" (UniqueName: \"kubernetes.io/projected/35b8efb0-212f-4ee0-bb05-4655aff260b5-kube-api-access-dmn5j\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156133 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156142 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156150 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156158 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.156167 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35b8efb0-212f-4ee0-bb05-4655aff260b5-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.231071 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"] Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.428808 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerStarted","Data":"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde"} Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.428874 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerStarted","Data":"5fc09bb80d702cbcbe035b81e8b835d219b1710a866cf58373f31844d607a73c"} Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.431410 4795 generic.go:334] "Generic (PLEG): container finished" podID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerID="9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8" exitCode=0 Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.431449 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.431466 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" event={"ID":"35b8efb0-212f-4ee0-bb05-4655aff260b5","Type":"ContainerDied","Data":"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8"} Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.432002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-g7pv7" event={"ID":"35b8efb0-212f-4ee0-bb05-4655aff260b5","Type":"ContainerDied","Data":"54687987edb9f2765e8d4f7b8bfef3664f1024d2c67848e40765c69ff1c22cea"} Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.432041 4795 scope.go:117] "RemoveContainer" containerID="9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.449912 4795 scope.go:117] "RemoveContainer" containerID="b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087" Mar 20 17:41:57 crc kubenswrapper[4795]: 
I0320 17:41:57.460567 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.467589 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-g7pv7"] Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.473324 4795 scope.go:117] "RemoveContainer" containerID="9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8" Mar 20 17:41:57 crc kubenswrapper[4795]: E0320 17:41:57.474951 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8\": container with ID starting with 9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8 not found: ID does not exist" containerID="9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.474990 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8"} err="failed to get container status \"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8\": rpc error: code = NotFound desc = could not find container \"9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8\": container with ID starting with 9319cb95ba4e6916735d434a5dc09475ce20e0823d63f5dbe3c0c4b17579b3a8 not found: ID does not exist" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.475012 4795 scope.go:117] "RemoveContainer" containerID="b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087" Mar 20 17:41:57 crc kubenswrapper[4795]: E0320 17:41:57.475918 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087\": container with ID starting with 
b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087 not found: ID does not exist" containerID="b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087" Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.475974 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087"} err="failed to get container status \"b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087\": rpc error: code = NotFound desc = could not find container \"b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087\": container with ID starting with b7e53f97bd96ed250e5f1dc1eb4d8a35d4e93ed2609174b04fee4ff1d2f04087 not found: ID does not exist" Mar 20 17:41:57 crc kubenswrapper[4795]: W0320 17:41:57.499512 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c5c2934_fe58_4707_9bb7_a5e2372bad83.slice/crio-ab74db3fa7a7744639604aaeb749f8f976df6b4fb83ed368049c20e9d80debc3 WatchSource:0}: Error finding container ab74db3fa7a7744639604aaeb749f8f976df6b4fb83ed368049c20e9d80debc3: Status 404 returned error can't find the container with id ab74db3fa7a7744639604aaeb749f8f976df6b4fb83ed368049c20e9d80debc3 Mar 20 17:41:57 crc kubenswrapper[4795]: I0320 17:41:57.504879 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-ch8jm"] Mar 20 17:41:58 crc kubenswrapper[4795]: I0320 17:41:58.445113 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac35d627-20df-4aad-9779-e154f9cb617a" containerID="a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde" exitCode=0 Mar 20 17:41:58 crc kubenswrapper[4795]: I0320 17:41:58.445184 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxkd" 
event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerDied","Data":"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde"} Mar 20 17:41:58 crc kubenswrapper[4795]: I0320 17:41:58.453601 4795 generic.go:334] "Generic (PLEG): container finished" podID="5c5c2934-fe58-4707-9bb7-a5e2372bad83" containerID="e5dd02238cefff0ea9b15c3254b140992155778ea40801eac45bd1f08b16bf9c" exitCode=0 Mar 20 17:41:58 crc kubenswrapper[4795]: I0320 17:41:58.453679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" event={"ID":"5c5c2934-fe58-4707-9bb7-a5e2372bad83","Type":"ContainerDied","Data":"e5dd02238cefff0ea9b15c3254b140992155778ea40801eac45bd1f08b16bf9c"} Mar 20 17:41:58 crc kubenswrapper[4795]: I0320 17:41:58.453796 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" event={"ID":"5c5c2934-fe58-4707-9bb7-a5e2372bad83","Type":"ContainerStarted","Data":"ab74db3fa7a7744639604aaeb749f8f976df6b4fb83ed368049c20e9d80debc3"} Mar 20 17:41:59 crc kubenswrapper[4795]: I0320 17:41:59.278154 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" path="/var/lib/kubelet/pods/35b8efb0-212f-4ee0-bb05-4655aff260b5/volumes" Mar 20 17:41:59 crc kubenswrapper[4795]: I0320 17:41:59.479244 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" event={"ID":"5c5c2934-fe58-4707-9bb7-a5e2372bad83","Type":"ContainerStarted","Data":"de2ab699cb1b7869321e76fdc3a3051d733e4a49eed44f4ec3e8d28f0e328652"} Mar 20 17:41:59 crc kubenswrapper[4795]: I0320 17:41:59.479438 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" Mar 20 17:41:59 crc kubenswrapper[4795]: I0320 17:41:59.507928 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm" podStartSLOduration=3.507905884 
podStartE2EDuration="3.507905884s" podCreationTimestamp="2026-03-20 17:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:41:59.503011725 +0000 UTC m=+1462.961043276" watchObservedRunningTime="2026-03-20 17:41:59.507905884 +0000 UTC m=+1462.965937425" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.161487 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567142-zrn58"] Mar 20 17:42:00 crc kubenswrapper[4795]: E0320 17:42:00.161943 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="dnsmasq-dns" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.161964 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="dnsmasq-dns" Mar 20 17:42:00 crc kubenswrapper[4795]: E0320 17:42:00.162019 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="init" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.162028 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="init" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.162247 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b8efb0-212f-4ee0-bb05-4655aff260b5" containerName="dnsmasq-dns" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.163001 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.167063 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.167997 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.168009 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.178177 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-zrn58"] Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.223891 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4s65\" (UniqueName: \"kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65\") pod \"auto-csr-approver-29567142-zrn58\" (UID: \"df931d18-2dae-408e-823d-45c28b0a31c2\") " pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.326336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4s65\" (UniqueName: \"kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65\") pod \"auto-csr-approver-29567142-zrn58\" (UID: \"df931d18-2dae-408e-823d-45c28b0a31c2\") " pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.346602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4s65\" (UniqueName: \"kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65\") pod \"auto-csr-approver-29567142-zrn58\" (UID: \"df931d18-2dae-408e-823d-45c28b0a31c2\") " 
pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.482981 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-zrn58" Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.491236 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac35d627-20df-4aad-9779-e154f9cb617a" containerID="e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5" exitCode=0 Mar 20 17:42:00 crc kubenswrapper[4795]: I0320 17:42:00.491331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerDied","Data":"e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5"} Mar 20 17:42:00 crc kubenswrapper[4795]: W0320 17:42:00.997536 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf931d18_2dae_408e_823d_45c28b0a31c2.slice/crio-1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6 WatchSource:0}: Error finding container 1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6: Status 404 returned error can't find the container with id 1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6 Mar 20 17:42:01 crc kubenswrapper[4795]: I0320 17:42:01.014184 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-zrn58"] Mar 20 17:42:01 crc kubenswrapper[4795]: I0320 17:42:01.500282 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-zrn58" event={"ID":"df931d18-2dae-408e-823d-45c28b0a31c2","Type":"ContainerStarted","Data":"1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6"} Mar 20 17:42:01 crc kubenswrapper[4795]: I0320 17:42:01.503934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerStarted","Data":"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66"} Mar 20 17:42:01 crc kubenswrapper[4795]: I0320 17:42:01.527016 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8rxkd" podStartSLOduration=3.075351152 podStartE2EDuration="5.526996117s" podCreationTimestamp="2026-03-20 17:41:56 +0000 UTC" firstStartedPulling="2026-03-20 17:41:58.447928263 +0000 UTC m=+1461.905959844" lastFinishedPulling="2026-03-20 17:42:00.899573258 +0000 UTC m=+1464.357604809" observedRunningTime="2026-03-20 17:42:01.518984482 +0000 UTC m=+1464.977016033" watchObservedRunningTime="2026-03-20 17:42:01.526996117 +0000 UTC m=+1464.985027668" Mar 20 17:42:03 crc kubenswrapper[4795]: I0320 17:42:03.533361 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-zrn58" event={"ID":"df931d18-2dae-408e-823d-45c28b0a31c2","Type":"ContainerStarted","Data":"ca6e296f2643b1e5c67cd7c021c2bf95d4bbdd0b4c6082814566acaf425b562b"} Mar 20 17:42:04 crc kubenswrapper[4795]: I0320 17:42:04.548850 4795 generic.go:334] "Generic (PLEG): container finished" podID="df931d18-2dae-408e-823d-45c28b0a31c2" containerID="ca6e296f2643b1e5c67cd7c021c2bf95d4bbdd0b4c6082814566acaf425b562b" exitCode=0 Mar 20 17:42:04 crc kubenswrapper[4795]: I0320 17:42:04.548987 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-zrn58" event={"ID":"df931d18-2dae-408e-823d-45c28b0a31c2","Type":"ContainerDied","Data":"ca6e296f2643b1e5c67cd7c021c2bf95d4bbdd0b4c6082814566acaf425b562b"} Mar 20 17:42:04 crc kubenswrapper[4795]: I0320 17:42:04.937455 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-zrn58"
Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.033282 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4s65\" (UniqueName: \"kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65\") pod \"df931d18-2dae-408e-823d-45c28b0a31c2\" (UID: \"df931d18-2dae-408e-823d-45c28b0a31c2\") "
Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.039496 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65" (OuterVolumeSpecName: "kube-api-access-r4s65") pod "df931d18-2dae-408e-823d-45c28b0a31c2" (UID: "df931d18-2dae-408e-823d-45c28b0a31c2"). InnerVolumeSpecName "kube-api-access-r4s65". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.135248 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4s65\" (UniqueName: \"kubernetes.io/projected/df931d18-2dae-408e-823d-45c28b0a31c2-kube-api-access-r4s65\") on node \"crc\" DevicePath \"\""
Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.562657 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-zrn58" event={"ID":"df931d18-2dae-408e-823d-45c28b0a31c2","Type":"ContainerDied","Data":"1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6"}
Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.563053 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ba6445df7e0c562fcaa5313942ae9a5ee384f9a2cd71fe064100fa31959edd6"
Mar 20 17:42:05 crc kubenswrapper[4795]: I0320 17:42:05.562796 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-zrn58"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.031425 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-j4mtv"]
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.043916 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-j4mtv"]
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.528517 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8rxkd"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.528602 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8rxkd"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.619207 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8rxkd"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.680658 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"]
Mar 20 17:42:06 crc kubenswrapper[4795]: E0320 17:42:06.681217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df931d18-2dae-408e-823d-45c28b0a31c2" containerName="oc"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.681241 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="df931d18-2dae-408e-823d-45c28b0a31c2" containerName="oc"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.681490 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="df931d18-2dae-408e-823d-45c28b0a31c2" containerName="oc"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.709966 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8rxkd"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.710158 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.743151 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"]
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.772181 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjss4\" (UniqueName: \"kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.772323 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.772403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.873678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjss4\" (UniqueName: \"kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.873815 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.873881 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.874458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.874532 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:06 crc kubenswrapper[4795]: I0320 17:42:06.892882 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjss4\" (UniqueName: \"kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4\") pod \"redhat-operators-lcjkp\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.014859 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54ffdb7d8c-ch8jm"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.048499 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.075651 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"]
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.075956 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="dnsmasq-dns" containerID="cri-o://c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824" gracePeriod=10
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.264740 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f88deb-b38d-4c52-a901-baeb9da08559" path="/var/lib/kubelet/pods/38f88deb-b38d-4c52-a901-baeb9da08559/volumes"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.574034 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"]
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.583491 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.613512 4795 generic.go:334] "Generic (PLEG): container finished" podID="38f2311f-ace5-4469-906b-05443d175f81" containerID="c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824" exitCode=0
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.614414 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.614896 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" event={"ID":"38f2311f-ace5-4469-906b-05443d175f81","Type":"ContainerDied","Data":"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824"}
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.614930 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-ttfhl" event={"ID":"38f2311f-ace5-4469-906b-05443d175f81","Type":"ContainerDied","Data":"8a4135aec749bd5b9e65098bed9d5a85b6fe9d80a88ef7fade7429aaedbcd5f3"}
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.614950 4795 scope.go:117] "RemoveContainer" containerID="c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.682974 4795 scope.go:117] "RemoveContainer" containerID="cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692174 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") "
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692560 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") "
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") "
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqx4z\" (UniqueName: \"kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") "
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692706 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") "
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692751 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") "
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.692832 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb\") pod \"38f2311f-ace5-4469-906b-05443d175f81\" (UID: \"38f2311f-ace5-4469-906b-05443d175f81\") "
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.707567 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z" (OuterVolumeSpecName: "kube-api-access-dqx4z") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "kube-api-access-dqx4z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.727863 4795 scope.go:117] "RemoveContainer" containerID="c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824"
Mar 20 17:42:07 crc kubenswrapper[4795]: E0320 17:42:07.730456 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824\": container with ID starting with c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824 not found: ID does not exist" containerID="c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.730526 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824"} err="failed to get container status \"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824\": rpc error: code = NotFound desc = could not find container \"c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824\": container with ID starting with c9e1102240e3415b8d102ed76951c97c9afb306467d3d068e8599e0fc711a824 not found: ID does not exist"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.730555 4795 scope.go:117] "RemoveContainer" containerID="cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff"
Mar 20 17:42:07 crc kubenswrapper[4795]: E0320 17:42:07.732301 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff\": container with ID starting with cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff not found: ID does not exist" containerID="cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.732329 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff"} err="failed to get container status \"cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff\": rpc error: code = NotFound desc = could not find container \"cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff\": container with ID starting with cd3feb7b56defa28a03834a58f951355bb1aad855fec73711fd46c0981cb81ff not found: ID does not exist"
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.765812 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config" (OuterVolumeSpecName: "config") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.765855 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.769839 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.782766 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.791063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.795959 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.795998 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.796012 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqx4z\" (UniqueName: \"kubernetes.io/projected/38f2311f-ace5-4469-906b-05443d175f81-kube-api-access-dqx4z\") on node \"crc\" DevicePath \"\""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.796025 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.796036 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.796048 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.804080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "38f2311f-ace5-4469-906b-05443d175f81" (UID: "38f2311f-ace5-4469-906b-05443d175f81"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.897243 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/38f2311f-ace5-4469-906b-05443d175f81-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.986277 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"]
Mar 20 17:42:07 crc kubenswrapper[4795]: I0320 17:42:07.996946 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-ttfhl"]
Mar 20 17:42:08 crc kubenswrapper[4795]: I0320 17:42:08.217872 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"]
Mar 20 17:42:08 crc kubenswrapper[4795]: I0320 17:42:08.631149 4795 generic.go:334] "Generic (PLEG): container finished" podID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerID="d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa" exitCode=0
Mar 20 17:42:08 crc kubenswrapper[4795]: I0320 17:42:08.631246 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerDied","Data":"d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa"}
Mar 20 17:42:08 crc kubenswrapper[4795]: I0320 17:42:08.631322 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerStarted","Data":"fabd95db1d654ea27cdf8aa4144ea02558f6e4e1468eabf7d4dab5f79064ffa1"}
Mar 20 17:42:08 crc kubenswrapper[4795]: I0320 17:42:08.631472 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8rxkd" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="registry-server" containerID="cri-o://f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66" gracePeriod=2
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.106785 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rxkd"
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.223124 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content\") pod \"ac35d627-20df-4aad-9779-e154f9cb617a\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") "
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.223173 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities\") pod \"ac35d627-20df-4aad-9779-e154f9cb617a\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") "
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.223245 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vrmg\" (UniqueName: \"kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg\") pod \"ac35d627-20df-4aad-9779-e154f9cb617a\" (UID: \"ac35d627-20df-4aad-9779-e154f9cb617a\") "
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.224133 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities" (OuterVolumeSpecName: "utilities") pod "ac35d627-20df-4aad-9779-e154f9cb617a" (UID: "ac35d627-20df-4aad-9779-e154f9cb617a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.224375 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.231554 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg" (OuterVolumeSpecName: "kube-api-access-2vrmg") pod "ac35d627-20df-4aad-9779-e154f9cb617a" (UID: "ac35d627-20df-4aad-9779-e154f9cb617a"). InnerVolumeSpecName "kube-api-access-2vrmg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.278414 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f2311f-ace5-4469-906b-05443d175f81" path="/var/lib/kubelet/pods/38f2311f-ace5-4469-906b-05443d175f81/volumes"
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.327265 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vrmg\" (UniqueName: \"kubernetes.io/projected/ac35d627-20df-4aad-9779-e154f9cb617a-kube-api-access-2vrmg\") on node \"crc\" DevicePath \"\""
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.328766 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac35d627-20df-4aad-9779-e154f9cb617a" (UID: "ac35d627-20df-4aad-9779-e154f9cb617a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.429120 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac35d627-20df-4aad-9779-e154f9cb617a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.687749 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac35d627-20df-4aad-9779-e154f9cb617a" containerID="f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66" exitCode=0
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.687818 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerDied","Data":"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66"}
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.687863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rxkd" event={"ID":"ac35d627-20df-4aad-9779-e154f9cb617a","Type":"ContainerDied","Data":"5fc09bb80d702cbcbe035b81e8b835d219b1710a866cf58373f31844d607a73c"}
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.687922 4795 scope.go:117] "RemoveContainer" containerID="f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66"
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.687987 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rxkd"
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.710074 4795 scope.go:117] "RemoveContainer" containerID="e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5"
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.739790 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"]
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.741006 4795 scope.go:117] "RemoveContainer" containerID="a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde"
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.749029 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8rxkd"]
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.811898 4795 scope.go:117] "RemoveContainer" containerID="f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66"
Mar 20 17:42:09 crc kubenswrapper[4795]: E0320 17:42:09.812469 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66\": container with ID starting with f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66 not found: ID does not exist" containerID="f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66"
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.812511 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66"} err="failed to get container status \"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66\": rpc error: code = NotFound desc = could not find container \"f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66\": container with ID starting with f7cc83b4ce64f913ae3d5fd6c61956cb7b0fbb36ef3d7c285365fd90b2045c66 not found: ID does not exist"
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.812537 4795 scope.go:117] "RemoveContainer" containerID="e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5"
Mar 20 17:42:09 crc kubenswrapper[4795]: E0320 17:42:09.812870 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5\": container with ID starting with e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5 not found: ID does not exist" containerID="e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5"
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.812910 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5"} err="failed to get container status \"e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5\": rpc error: code = NotFound desc = could not find container \"e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5\": container with ID starting with e13b2f94e4adb1fc834b83ab2e6eadddae7ae552979c696ad43283014208caa5 not found: ID does not exist"
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.812935 4795 scope.go:117] "RemoveContainer" containerID="a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde"
Mar 20 17:42:09 crc kubenswrapper[4795]: E0320 17:42:09.813193 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde\": container with ID starting with a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde not found: ID does not exist" containerID="a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde"
Mar 20 17:42:09 crc kubenswrapper[4795]: I0320 17:42:09.813216 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde"} err="failed to get container status \"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde\": rpc error: code = NotFound desc = could not find container \"a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde\": container with ID starting with a420272dc5141a90656e3cdca0b58b2397cc7f71695f254fca2019d35c524fde not found: ID does not exist"
Mar 20 17:42:10 crc kubenswrapper[4795]: I0320 17:42:10.703354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerStarted","Data":"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561"}
Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.274370 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" path="/var/lib/kubelet/pods/ac35d627-20df-4aad-9779-e154f9cb617a/volumes"
Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.301074 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.301173 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.301239 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt"
Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.303001 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 17:42:11 crc kubenswrapper[4795]: I0320 17:42:11.303137 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1" gracePeriod=600
Mar 20 17:42:12 crc kubenswrapper[4795]: I0320 17:42:12.734382 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1" exitCode=0
Mar 20 17:42:12 crc kubenswrapper[4795]: I0320 17:42:12.734447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1"}
Mar 20 17:42:12 crc kubenswrapper[4795]: I0320 17:42:12.735050 4795 scope.go:117] "RemoveContainer" containerID="98738c7a776083a3e8bececf1e7b1ed94fe90e2849a5325ee062c3833ce7e7dd"
Mar 20 17:42:12 crc kubenswrapper[4795]: I0320 17:42:12.739634 4795 generic.go:334] "Generic (PLEG): container finished" podID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerID="143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561" exitCode=0
Mar 20 17:42:12 crc kubenswrapper[4795]: I0320 17:42:12.739715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerDied","Data":"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561"}
Mar 20 17:42:13 crc kubenswrapper[4795]: I0320 17:42:13.758751 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerStarted","Data":"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a"}
Mar 20 17:42:13 crc kubenswrapper[4795]: I0320 17:42:13.762337 4795 generic.go:334] "Generic (PLEG): container finished" podID="30c1ffc4-752a-4b0a-a95b-2bfbc458dc53" containerID="91b2765becbe485413b561b8cb2a1ab6831f2a2f0328f3ad53837ee41431baef" exitCode=0
Mar 20 17:42:13 crc kubenswrapper[4795]: I0320 17:42:13.762429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53","Type":"ContainerDied","Data":"91b2765becbe485413b561b8cb2a1ab6831f2a2f0328f3ad53837ee41431baef"}
Mar 20 17:42:13 crc kubenswrapper[4795]: I0320 17:42:13.772085 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"}
Mar 20 17:42:13 crc kubenswrapper[4795]: I0320 17:42:13.803808 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lcjkp" podStartSLOduration=3.062137191 podStartE2EDuration="7.803787006s" podCreationTimestamp="2026-03-20 17:42:06 +0000 UTC" firstStartedPulling="2026-03-20 17:42:08.636319115 +0000 UTC m=+1472.094350666" lastFinishedPulling="2026-03-20 17:42:13.37796893 +0000 UTC m=+1476.836000481" observedRunningTime="2026-03-20 17:42:13.787839569 +0000 UTC m=+1477.245871150" watchObservedRunningTime="2026-03-20 17:42:13.803787006 +0000 UTC m=+1477.261818547"
Mar 20 17:42:14 crc kubenswrapper[4795]: I0320 17:42:14.782180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30c1ffc4-752a-4b0a-a95b-2bfbc458dc53","Type":"ContainerStarted","Data":"4866f5f245a841d3b385d7f1801c46ad49f8591588f65bbe7c1664b1fe275b87"}
Mar 20 17:42:14 crc kubenswrapper[4795]: I0320 17:42:14.783030 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 17:42:14 crc kubenswrapper[4795]: I0320 17:42:14.819845 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.819818875 podStartE2EDuration="35.819818875s" podCreationTimestamp="2026-03-20 17:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:42:14.806804908 +0000 UTC m=+1478.264836489" watchObservedRunningTime="2026-03-20 17:42:14.819818875 +0000 UTC m=+1478.277850446"
Mar 20 17:42:17 crc kubenswrapper[4795]: I0320 17:42:17.049383 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:17 crc kubenswrapper[4795]: I0320 17:42:17.050054 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lcjkp"
Mar 20 17:42:17 crc kubenswrapper[4795]: I0320 17:42:17.815304 4795 generic.go:334] "Generic (PLEG): container finished" podID="ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc" containerID="6c9bb98a5b27dfaab93f9b0bf86e2dc36843d779c16099e2338e7ce5f1541db7" exitCode=0
Mar 20 17:42:17 crc kubenswrapper[4795]: I0320 17:42:17.815386 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc","Type":"ContainerDied","Data":"6c9bb98a5b27dfaab93f9b0bf86e2dc36843d779c16099e2338e7ce5f1541db7"}
Mar 20 17:42:18 crc kubenswrapper[4795]: I0320 17:42:18.128318 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcjkp" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" probeResult="failure" output=<
Mar 20 17:42:18 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Mar 20 17:42:18 crc kubenswrapper[4795]: >
Mar 20 17:42:18 crc kubenswrapper[4795]: I0320 17:42:18.825036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc","Type":"ContainerStarted","Data":"20aa4d5b499482117c4d3947f9429d441904750ad7290a14f19d2ebe60c52bab"}
Mar 20 17:42:18 crc kubenswrapper[4795]: I0320 17:42:18.826311 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 20 17:42:18 crc kubenswrapper[4795]: I0320 17:42:18.858913 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.858891258 podStartE2EDuration="35.858891258s" podCreationTimestamp="2026-03-20 17:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:42:18.856181676 +0000 UTC m=+1482.314213237" watchObservedRunningTime="2026-03-20 17:42:18.858891258 +0000 UTC m=+1482.316922799"
Mar 20 17:42:28 crc kubenswrapper[4795]: I0320 17:42:28.117232 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcjkp" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" probeResult="failure" output=<
Mar 20 17:42:28 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Mar 20 17:42:28 crc 
kubenswrapper[4795]: > Mar 20 17:42:29 crc kubenswrapper[4795]: I0320 17:42:29.603015 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.076598 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk"] Mar 20 17:42:30 crc kubenswrapper[4795]: E0320 17:42:30.077089 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="registry-server" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077107 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="registry-server" Mar 20 17:42:30 crc kubenswrapper[4795]: E0320 17:42:30.077128 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="dnsmasq-dns" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077135 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="dnsmasq-dns" Mar 20 17:42:30 crc kubenswrapper[4795]: E0320 17:42:30.077150 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="init" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077160 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="init" Mar 20 17:42:30 crc kubenswrapper[4795]: E0320 17:42:30.077169 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="extract-utilities" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077176 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="extract-utilities" Mar 20 17:42:30 crc kubenswrapper[4795]: E0320 
17:42:30.077204 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="extract-content" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077211 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="extract-content" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077422 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f2311f-ace5-4469-906b-05443d175f81" containerName="dnsmasq-dns" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.077441 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac35d627-20df-4aad-9779-e154f9cb617a" containerName="registry-server" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.078104 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.080191 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.080239 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.080498 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.080681 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.107423 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk"] Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.241326 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.241382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r44j2\" (UniqueName: \"kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.241504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.241541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.343209 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.343257 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r44j2\" (UniqueName: \"kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.343338 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.343370 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.349096 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.349740 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.351034 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.359855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r44j2\" (UniqueName: \"kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.417405 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:30 crc kubenswrapper[4795]: I0320 17:42:30.946394 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk"] Mar 20 17:42:30 crc kubenswrapper[4795]: W0320 17:42:30.958830 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7bbeb5c_0f49_4fb3_b0b4_57c9bf91977e.slice/crio-5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9 WatchSource:0}: Error finding container 5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9: Status 404 returned error can't find the container with id 5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9 Mar 20 17:42:31 crc kubenswrapper[4795]: I0320 17:42:31.938446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" event={"ID":"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e","Type":"ContainerStarted","Data":"5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9"} Mar 20 17:42:33 crc kubenswrapper[4795]: I0320 17:42:33.651019 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 17:42:38 crc kubenswrapper[4795]: I0320 17:42:38.096972 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lcjkp" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" probeResult="failure" output=< Mar 20 17:42:38 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 17:42:38 crc kubenswrapper[4795]: > Mar 20 17:42:40 crc kubenswrapper[4795]: I0320 17:42:40.894883 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:42:42 crc kubenswrapper[4795]: I0320 17:42:42.029870 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" event={"ID":"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e","Type":"ContainerStarted","Data":"46997074f9c4b9e4f4b38226f8f5a757e2619d320bdcf5429c1321ed40595a79"} Mar 20 17:42:42 crc kubenswrapper[4795]: I0320 17:42:42.057958 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" podStartSLOduration=2.125776813 podStartE2EDuration="12.057940422s" podCreationTimestamp="2026-03-20 17:42:30 +0000 UTC" firstStartedPulling="2026-03-20 17:42:30.960493025 +0000 UTC m=+1494.418524566" lastFinishedPulling="2026-03-20 17:42:40.892656624 +0000 UTC m=+1504.350688175" observedRunningTime="2026-03-20 17:42:42.052473277 +0000 UTC m=+1505.510504818" watchObservedRunningTime="2026-03-20 17:42:42.057940422 +0000 UTC m=+1505.515971963" Mar 20 17:42:47 crc kubenswrapper[4795]: I0320 17:42:47.108499 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:47 crc kubenswrapper[4795]: I0320 17:42:47.187436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:47 crc kubenswrapper[4795]: I0320 17:42:47.382635 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"] Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.097827 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lcjkp" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" containerID="cri-o://d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a" gracePeriod=2 Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.685550 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.761758 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjss4\" (UniqueName: \"kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4\") pod \"196de415-75ca-4b43-bb26-0a9a5a993b1e\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.761853 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities\") pod \"196de415-75ca-4b43-bb26-0a9a5a993b1e\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.762061 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content\") pod \"196de415-75ca-4b43-bb26-0a9a5a993b1e\" (UID: \"196de415-75ca-4b43-bb26-0a9a5a993b1e\") " Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.762675 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities" (OuterVolumeSpecName: "utilities") pod "196de415-75ca-4b43-bb26-0a9a5a993b1e" (UID: "196de415-75ca-4b43-bb26-0a9a5a993b1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.768980 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4" (OuterVolumeSpecName: "kube-api-access-rjss4") pod "196de415-75ca-4b43-bb26-0a9a5a993b1e" (UID: "196de415-75ca-4b43-bb26-0a9a5a993b1e"). InnerVolumeSpecName "kube-api-access-rjss4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.801893 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gtd2l"] Mar 20 17:42:49 crc kubenswrapper[4795]: E0320 17:42:49.802320 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.802338 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" Mar 20 17:42:49 crc kubenswrapper[4795]: E0320 17:42:49.802355 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="extract-content" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.802365 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="extract-content" Mar 20 17:42:49 crc kubenswrapper[4795]: E0320 17:42:49.802407 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="extract-utilities" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.802417 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="extract-utilities" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.802626 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerName="registry-server" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.805362 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.823551 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gtd2l"] Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.864481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-utilities\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.864573 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-catalog-content\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.864647 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78b5w\" (UniqueName: \"kubernetes.io/projected/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-kube-api-access-78b5w\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.864800 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjss4\" (UniqueName: \"kubernetes.io/projected/196de415-75ca-4b43-bb26-0a9a5a993b1e-kube-api-access-rjss4\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.864835 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.912139 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "196de415-75ca-4b43-bb26-0a9a5a993b1e" (UID: "196de415-75ca-4b43-bb26-0a9a5a993b1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.966644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-utilities\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.966945 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-catalog-content\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.967085 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78b5w\" (UniqueName: \"kubernetes.io/projected/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-kube-api-access-78b5w\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.967293 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/196de415-75ca-4b43-bb26-0a9a5a993b1e-catalog-content\") on node \"crc\" 
DevicePath \"\"" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.967565 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-utilities\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.967587 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-catalog-content\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:49 crc kubenswrapper[4795]: I0320 17:42:49.985184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78b5w\" (UniqueName: \"kubernetes.io/projected/c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0-kube-api-access-78b5w\") pod \"community-operators-gtd2l\" (UID: \"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0\") " pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.108049 4795 generic.go:334] "Generic (PLEG): container finished" podID="196de415-75ca-4b43-bb26-0a9a5a993b1e" containerID="d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a" exitCode=0 Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.108096 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerDied","Data":"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a"} Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.108126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lcjkp" 
event={"ID":"196de415-75ca-4b43-bb26-0a9a5a993b1e","Type":"ContainerDied","Data":"fabd95db1d654ea27cdf8aa4144ea02558f6e4e1468eabf7d4dab5f79064ffa1"} Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.108148 4795 scope.go:117] "RemoveContainer" containerID="d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.108282 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lcjkp" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.132855 4795 scope.go:117] "RemoveContainer" containerID="143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.156680 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"] Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.157232 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.165412 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lcjkp"] Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.206967 4795 scope.go:117] "RemoveContainer" containerID="d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.276922 4795 scope.go:117] "RemoveContainer" containerID="d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a" Mar 20 17:42:50 crc kubenswrapper[4795]: E0320 17:42:50.277705 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a\": container with ID starting with d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a not found: ID does not exist" 
containerID="d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.277733 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a"} err="failed to get container status \"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a\": rpc error: code = NotFound desc = could not find container \"d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a\": container with ID starting with d4cdadbbd3f2ed8b4119e7d2c87c2eb0c00923d2facbe057c8a60aa4226eae0a not found: ID does not exist" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.277752 4795 scope.go:117] "RemoveContainer" containerID="143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561" Mar 20 17:42:50 crc kubenswrapper[4795]: E0320 17:42:50.278133 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561\": container with ID starting with 143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561 not found: ID does not exist" containerID="143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.278178 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561"} err="failed to get container status \"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561\": rpc error: code = NotFound desc = could not find container \"143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561\": container with ID starting with 143cd9070b23642af467dc287046225fad9a76bc6533a414d2dcdf67b6ab1561 not found: ID does not exist" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.278208 4795 scope.go:117] 
"RemoveContainer" containerID="d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa" Mar 20 17:42:50 crc kubenswrapper[4795]: E0320 17:42:50.278458 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa\": container with ID starting with d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa not found: ID does not exist" containerID="d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.278480 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa"} err="failed to get container status \"d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa\": rpc error: code = NotFound desc = could not find container \"d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa\": container with ID starting with d308c18002cd90a1ad0e664f0afdcb7e93ccdf2b5c1e08bade0c920c2fcdbfaa not found: ID does not exist" Mar 20 17:42:50 crc kubenswrapper[4795]: I0320 17:42:50.706826 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gtd2l"] Mar 20 17:42:51 crc kubenswrapper[4795]: I0320 17:42:51.127018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtd2l" event={"ID":"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0","Type":"ContainerStarted","Data":"d7167042db96168386712bff371b2d9d2d5243adbff6d1143fb5d96c2d30f752"} Mar 20 17:42:51 crc kubenswrapper[4795]: I0320 17:42:51.127113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtd2l" event={"ID":"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0","Type":"ContainerStarted","Data":"1e3d77ed281152e7161201d4f3863784c405b8d5680af3ee5b4cdedffa226744"} Mar 20 17:42:51 crc 
kubenswrapper[4795]: I0320 17:42:51.228775 4795 scope.go:117] "RemoveContainer" containerID="a9b37f38da5a02a709d41d5d8718cdfaffaae9f225a892b8a803fd7f9d1c5b9d" Mar 20 17:42:51 crc kubenswrapper[4795]: I0320 17:42:51.262404 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="196de415-75ca-4b43-bb26-0a9a5a993b1e" path="/var/lib/kubelet/pods/196de415-75ca-4b43-bb26-0a9a5a993b1e/volumes" Mar 20 17:42:52 crc kubenswrapper[4795]: I0320 17:42:52.144789 4795 generic.go:334] "Generic (PLEG): container finished" podID="c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0" containerID="d7167042db96168386712bff371b2d9d2d5243adbff6d1143fb5d96c2d30f752" exitCode=0 Mar 20 17:42:52 crc kubenswrapper[4795]: I0320 17:42:52.144873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtd2l" event={"ID":"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0","Type":"ContainerDied","Data":"d7167042db96168386712bff371b2d9d2d5243adbff6d1143fb5d96c2d30f752"} Mar 20 17:42:55 crc kubenswrapper[4795]: I0320 17:42:55.172321 4795 generic.go:334] "Generic (PLEG): container finished" podID="e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" containerID="46997074f9c4b9e4f4b38226f8f5a757e2619d320bdcf5429c1321ed40595a79" exitCode=0 Mar 20 17:42:55 crc kubenswrapper[4795]: I0320 17:42:55.172363 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" event={"ID":"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e","Type":"ContainerDied","Data":"46997074f9c4b9e4f4b38226f8f5a757e2619d320bdcf5429c1321ed40595a79"} Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.396174 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.563040 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam\") pod \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.563160 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle\") pod \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.563270 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r44j2\" (UniqueName: \"kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2\") pod \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.563362 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory\") pod \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\" (UID: \"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e\") " Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.572925 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" (UID: "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.573384 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2" (OuterVolumeSpecName: "kube-api-access-r44j2") pod "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" (UID: "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e"). InnerVolumeSpecName "kube-api-access-r44j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.606542 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory" (OuterVolumeSpecName: "inventory") pod "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" (UID: "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.627723 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" (UID: "e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.668221 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.668252 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.668263 4795 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:57 crc kubenswrapper[4795]: I0320 17:42:57.668272 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r44j2\" (UniqueName: \"kubernetes.io/projected/e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e-kube-api-access-r44j2\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.203121 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.203134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk" event={"ID":"e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e","Type":"ContainerDied","Data":"5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9"} Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.203290 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f5978baa7fdd7018b283540940fd4f0522fcf229adff2f6258d200ee99e61c9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.207640 4795 generic.go:334] "Generic (PLEG): container finished" podID="c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0" containerID="eb283ca4787e407dd208cf425b8606a032f09859fc48d162cab700175d647056" exitCode=0 Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.207716 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtd2l" event={"ID":"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0","Type":"ContainerDied","Data":"eb283ca4787e407dd208cf425b8606a032f09859fc48d162cab700175d647056"} Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.510716 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9"] Mar 20 17:42:58 crc kubenswrapper[4795]: E0320 17:42:58.511280 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.511304 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.511626 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.512515 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.515228 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.515255 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.516188 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.518474 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.532043 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9"] Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.687469 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjvtq\" (UniqueName: \"kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.688795 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" 
(UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.689095 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.791254 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.791414 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjvtq\" (UniqueName: \"kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.791564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.800014 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.807734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.816504 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjvtq\" (UniqueName: \"kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tx6d9\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:58 crc kubenswrapper[4795]: I0320 17:42:58.831475 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:42:59 crc kubenswrapper[4795]: I0320 17:42:59.231121 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9"] Mar 20 17:43:00 crc kubenswrapper[4795]: I0320 17:43:00.229675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtd2l" event={"ID":"c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0","Type":"ContainerStarted","Data":"f6ada8e1c8713492111085ea654640396f7aadd1a41c917769fc30db45524b2e"} Mar 20 17:43:00 crc kubenswrapper[4795]: I0320 17:43:00.231198 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" event={"ID":"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936","Type":"ContainerStarted","Data":"70ece47a35ac9935a28b0de5740030bafff6f54285749e61a4fad5cb6bada8f0"} Mar 20 17:43:00 crc kubenswrapper[4795]: I0320 17:43:00.258914 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gtd2l" podStartSLOduration=4.450067076 podStartE2EDuration="11.258891691s" podCreationTimestamp="2026-03-20 17:42:49 +0000 UTC" firstStartedPulling="2026-03-20 17:42:52.146793331 +0000 UTC m=+1515.604824912" lastFinishedPulling="2026-03-20 17:42:58.955617976 +0000 UTC m=+1522.413649527" observedRunningTime="2026-03-20 17:43:00.250103306 +0000 UTC m=+1523.708134867" watchObservedRunningTime="2026-03-20 17:43:00.258891691 +0000 UTC m=+1523.716923232" Mar 20 17:43:01 crc kubenswrapper[4795]: I0320 17:43:01.241000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" event={"ID":"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936","Type":"ContainerStarted","Data":"36715ba352620b7792388f384fb4873d45c3ad7407a718da6d06c5542fddbcf2"} Mar 20 17:43:01 crc kubenswrapper[4795]: I0320 17:43:01.260005 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" podStartSLOduration=2.376708208 podStartE2EDuration="3.259987033s" podCreationTimestamp="2026-03-20 17:42:58 +0000 UTC" firstStartedPulling="2026-03-20 17:42:59.238441795 +0000 UTC m=+1522.696473346" lastFinishedPulling="2026-03-20 17:43:00.12172063 +0000 UTC m=+1523.579752171" observedRunningTime="2026-03-20 17:43:01.256777307 +0000 UTC m=+1524.714808848" watchObservedRunningTime="2026-03-20 17:43:01.259987033 +0000 UTC m=+1524.718018574" Mar 20 17:43:03 crc kubenswrapper[4795]: I0320 17:43:03.266091 4795 generic.go:334] "Generic (PLEG): container finished" podID="d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" containerID="36715ba352620b7792388f384fb4873d45c3ad7407a718da6d06c5542fddbcf2" exitCode=0 Mar 20 17:43:03 crc kubenswrapper[4795]: I0320 17:43:03.267056 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" event={"ID":"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936","Type":"ContainerDied","Data":"36715ba352620b7792388f384fb4873d45c3ad7407a718da6d06c5542fddbcf2"} Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.713616 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.913668 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory\") pod \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.913779 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjvtq\" (UniqueName: \"kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq\") pod \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.913889 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam\") pod \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\" (UID: \"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936\") " Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.918321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq" (OuterVolumeSpecName: "kube-api-access-pjvtq") pod "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" (UID: "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936"). InnerVolumeSpecName "kube-api-access-pjvtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.943140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" (UID: "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:43:04 crc kubenswrapper[4795]: I0320 17:43:04.952805 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory" (OuterVolumeSpecName: "inventory") pod "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" (UID: "d7dc5d37-6d24-48ea-acc1-2b4ed3de6936"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.016654 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.016734 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjvtq\" (UniqueName: \"kubernetes.io/projected/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-kube-api-access-pjvtq\") on node \"crc\" DevicePath \"\"" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.016749 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d7dc5d37-6d24-48ea-acc1-2b4ed3de6936-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.291336 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" 
event={"ID":"d7dc5d37-6d24-48ea-acc1-2b4ed3de6936","Type":"ContainerDied","Data":"70ece47a35ac9935a28b0de5740030bafff6f54285749e61a4fad5cb6bada8f0"} Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.291709 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70ece47a35ac9935a28b0de5740030bafff6f54285749e61a4fad5cb6bada8f0" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.291411 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tx6d9" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.374780 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"] Mar 20 17:43:05 crc kubenswrapper[4795]: E0320 17:43:05.375159 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.375177 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.375361 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7dc5d37-6d24-48ea-acc1-2b4ed3de6936" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.376060 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.382047 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.382453 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.382666 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.390053 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.427106 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"] Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.447904 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.447969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.448068 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.448445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzxfh\" (UniqueName: \"kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.553118 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzxfh\" (UniqueName: \"kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.553193 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.553216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.553245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.557758 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.557969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.559188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.572136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzxfh\" (UniqueName: \"kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-65dps\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:05 crc kubenswrapper[4795]: I0320 17:43:05.705838 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:43:06 crc kubenswrapper[4795]: I0320 17:43:06.289854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps"] Mar 20 17:43:07 crc kubenswrapper[4795]: I0320 17:43:07.316061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" event={"ID":"0708214e-e711-465a-a54e-97a462b2777e","Type":"ContainerStarted","Data":"8048e17374d65a5593e7f3026aacfa891127041d59875217add93959662e9cdc"} Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.158717 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.159304 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.245619 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.388025 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-gtd2l" Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.568921 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gtd2l"] Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.625363 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tw8kt"] Mar 20 17:43:10 crc kubenswrapper[4795]: I0320 17:43:10.625641 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tw8kt" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="registry-server" containerID="cri-o://e51f3706cb85710070577b764255e613eb4bda5a66f7cf44046e8dca83ade02b" gracePeriod=2 Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.358394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" event={"ID":"0708214e-e711-465a-a54e-97a462b2777e","Type":"ContainerStarted","Data":"c179fa5fa1c857ee73ee0d25264e475b241069dffa52757206eb76081b38cae9"} Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.363862 4795 generic.go:334] "Generic (PLEG): container finished" podID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerID="e51f3706cb85710070577b764255e613eb4bda5a66f7cf44046e8dca83ade02b" exitCode=0 Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.364870 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerDied","Data":"e51f3706cb85710070577b764255e613eb4bda5a66f7cf44046e8dca83ade02b"} Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.378090 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" podStartSLOduration=4.301181573 podStartE2EDuration="6.378076604s" podCreationTimestamp="2026-03-20 17:43:05 
+0000 UTC" firstStartedPulling="2026-03-20 17:43:06.304700582 +0000 UTC m=+1529.762732113" lastFinishedPulling="2026-03-20 17:43:08.381595603 +0000 UTC m=+1531.839627144" observedRunningTime="2026-03-20 17:43:11.3742587 +0000 UTC m=+1534.832290251" watchObservedRunningTime="2026-03-20 17:43:11.378076604 +0000 UTC m=+1534.836108145" Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.865950 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tw8kt" Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.885377 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content\") pod \"0cba71d7-62e8-4541-9728-23dd5ff4b982\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.885446 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8j2j\" (UniqueName: \"kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j\") pod \"0cba71d7-62e8-4541-9728-23dd5ff4b982\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.885533 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities\") pod \"0cba71d7-62e8-4541-9728-23dd5ff4b982\" (UID: \"0cba71d7-62e8-4541-9728-23dd5ff4b982\") " Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.886542 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities" (OuterVolumeSpecName: "utilities") pod "0cba71d7-62e8-4541-9728-23dd5ff4b982" (UID: "0cba71d7-62e8-4541-9728-23dd5ff4b982"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.897508 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j" (OuterVolumeSpecName: "kube-api-access-m8j2j") pod "0cba71d7-62e8-4541-9728-23dd5ff4b982" (UID: "0cba71d7-62e8-4541-9728-23dd5ff4b982"). InnerVolumeSpecName "kube-api-access-m8j2j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.943869 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cba71d7-62e8-4541-9728-23dd5ff4b982" (UID: "0cba71d7-62e8-4541-9728-23dd5ff4b982"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.988239 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.988270 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8j2j\" (UniqueName: \"kubernetes.io/projected/0cba71d7-62e8-4541-9728-23dd5ff4b982-kube-api-access-m8j2j\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:11 crc kubenswrapper[4795]: I0320 17:43:11.988280 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cba71d7-62e8-4541-9728-23dd5ff4b982-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.377763 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tw8kt" event={"ID":"0cba71d7-62e8-4541-9728-23dd5ff4b982","Type":"ContainerDied","Data":"dc6b8994280ee1e1b27cf8f2c886374ff8bd967d27295a99223b5cd05c51c5e5"}
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.377838 4795 scope.go:117] "RemoveContainer" containerID="e51f3706cb85710070577b764255e613eb4bda5a66f7cf44046e8dca83ade02b"
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.377882 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tw8kt"
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.410166 4795 scope.go:117] "RemoveContainer" containerID="8963fe7721a09d9f6c228e790432497f6b1fff70d60afc4485e7fcd92391890f"
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.429826 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tw8kt"]
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.440910 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tw8kt"]
Mar 20 17:43:12 crc kubenswrapper[4795]: I0320 17:43:12.457298 4795 scope.go:117] "RemoveContainer" containerID="099eb6fe1b44619943ee789acf319c90001ea00f649ef59a36a0aa98e76bd549"
Mar 20 17:43:13 crc kubenswrapper[4795]: I0320 17:43:13.261099 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" path="/var/lib/kubelet/pods/0cba71d7-62e8-4541-9728-23dd5ff4b982/volumes"
Mar 20 17:43:51 crc kubenswrapper[4795]: I0320 17:43:51.411568 4795 scope.go:117] "RemoveContainer" containerID="b72c1a68c2af640e452031ab226e7a764d4d714ebc4a58907bac640f8e0500bb"
Mar 20 17:43:51 crc kubenswrapper[4795]: I0320 17:43:51.520603 4795 scope.go:117] "RemoveContainer" containerID="9df2c204a93d51c554ceaf159d1f9366b95bed6cc7f2757ae8ae8edae396f498"
Mar 20 17:43:51 crc kubenswrapper[4795]: I0320 17:43:51.552542 4795 scope.go:117] "RemoveContainer"
containerID="64da53ed17d6e7c8ed644863f568fc0f6e5e946972ad8fd66ba6db39c157b1e6"
Mar 20 17:43:51 crc kubenswrapper[4795]: I0320 17:43:51.614280 4795 scope.go:117] "RemoveContainer" containerID="788d4e8fa3d02d5dbc18ca88de94a81c7f1ea8b686ca6f20236e85aa9366458c"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.153884 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567144-khl6t"]
Mar 20 17:44:00 crc kubenswrapper[4795]: E0320 17:44:00.155061 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="extract-content"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.155083 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="extract-content"
Mar 20 17:44:00 crc kubenswrapper[4795]: E0320 17:44:00.155111 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.155122 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4795]: E0320 17:44:00.155143 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="extract-utilities"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.155155 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="extract-utilities"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.155475 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cba71d7-62e8-4541-9728-23dd5ff4b982" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.156426 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.159329 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.159635 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.160331 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.168388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-khl6t"]
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.279529 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn6bd\" (UniqueName: \"kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd\") pod \"auto-csr-approver-29567144-khl6t\" (UID: \"a6396cd8-bc19-4f24-ae36-12356bfa8133\") " pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.381129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn6bd\" (UniqueName: \"kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd\") pod \"auto-csr-approver-29567144-khl6t\" (UID: \"a6396cd8-bc19-4f24-ae36-12356bfa8133\") " pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.418847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn6bd\" (UniqueName: \"kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd\") pod \"auto-csr-approver-29567144-khl6t\" (UID: \"a6396cd8-bc19-4f24-ae36-12356bfa8133\") "
pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.477659 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.944276 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-khl6t"]
Mar 20 17:44:00 crc kubenswrapper[4795]: I0320 17:44:00.954735 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 17:44:01 crc kubenswrapper[4795]: I0320 17:44:01.940929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-khl6t" event={"ID":"a6396cd8-bc19-4f24-ae36-12356bfa8133","Type":"ContainerStarted","Data":"24b1cd71bef426c5b45ef08818327c53e150c64c24e186695f96c00266ca4afc"}
Mar 20 17:44:03 crc kubenswrapper[4795]: I0320 17:44:03.960053 4795 generic.go:334] "Generic (PLEG): container finished" podID="a6396cd8-bc19-4f24-ae36-12356bfa8133" containerID="29087c37b0e22594df358a498bb26205f2050bb1e4a607372b3a2ba3b4df8dd7" exitCode=0
Mar 20 17:44:03 crc kubenswrapper[4795]: I0320 17:44:03.960109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-khl6t" event={"ID":"a6396cd8-bc19-4f24-ae36-12356bfa8133","Type":"ContainerDied","Data":"29087c37b0e22594df358a498bb26205f2050bb1e4a607372b3a2ba3b4df8dd7"}
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.282975 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.380548 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn6bd\" (UniqueName: \"kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd\") pod \"a6396cd8-bc19-4f24-ae36-12356bfa8133\" (UID: \"a6396cd8-bc19-4f24-ae36-12356bfa8133\") "
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.387216 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd" (OuterVolumeSpecName: "kube-api-access-hn6bd") pod "a6396cd8-bc19-4f24-ae36-12356bfa8133" (UID: "a6396cd8-bc19-4f24-ae36-12356bfa8133"). InnerVolumeSpecName "kube-api-access-hn6bd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.483294 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn6bd\" (UniqueName: \"kubernetes.io/projected/a6396cd8-bc19-4f24-ae36-12356bfa8133-kube-api-access-hn6bd\") on node \"crc\" DevicePath \"\""
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.976874 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-khl6t" event={"ID":"a6396cd8-bc19-4f24-ae36-12356bfa8133","Type":"ContainerDied","Data":"24b1cd71bef426c5b45ef08818327c53e150c64c24e186695f96c00266ca4afc"}
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.976927 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b1cd71bef426c5b45ef08818327c53e150c64c24e186695f96c00266ca4afc"
Mar 20 17:44:05 crc kubenswrapper[4795]: I0320 17:44:05.976995 4795 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-khl6t"
Mar 20 17:44:06 crc kubenswrapper[4795]: I0320 17:44:06.353449 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-7flct"]
Mar 20 17:44:06 crc kubenswrapper[4795]: I0320 17:44:06.361719 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-7flct"]
Mar 20 17:44:07 crc kubenswrapper[4795]: I0320 17:44:07.265168 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e83d2a1a-2b3b-409a-997a-672e322b1d8e" path="/var/lib/kubelet/pods/e83d2a1a-2b3b-409a-997a-672e322b1d8e/volumes"
Mar 20 17:44:41 crc kubenswrapper[4795]: I0320 17:44:41.300635 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:44:41 crc kubenswrapper[4795]: I0320 17:44:41.301232 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:44:51 crc kubenswrapper[4795]: I0320 17:44:51.827675 4795 scope.go:117] "RemoveContainer" containerID="27cb2cc4ca0cf03af5e4f56a72a8901b4a28c70c5abb54e1f86d55c8053dcc74"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.170835 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"]
Mar 20 17:45:00 crc kubenswrapper[4795]: E0320 17:45:00.172483 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6396cd8-bc19-4f24-ae36-12356bfa8133" containerName="oc"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.172511 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6396cd8-bc19-4f24-ae36-12356bfa8133" containerName="oc"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.172766 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6396cd8-bc19-4f24-ae36-12356bfa8133" containerName="oc"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.173899 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.176735 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.176757 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.191097 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"]
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.311043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llqvx\" (UniqueName: \"kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.311476 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") "
pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.311534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.413382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llqvx\" (UniqueName: \"kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.413542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.413595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.414810 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.420446 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.432797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llqvx\" (UniqueName: \"kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx\") pod \"collect-profiles-29567145-fpxpb\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:00 crc kubenswrapper[4795]: I0320 17:45:00.518414 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:01 crc kubenswrapper[4795]: I0320 17:45:01.016417 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"]
Mar 20 17:45:01 crc kubenswrapper[4795]: I0320 17:45:01.604836 4795 generic.go:334] "Generic (PLEG): container finished" podID="cd60241d-b207-4a9a-86b6-3be32ab282d3" containerID="7e78ac608afa56e8111695b336413ee802aca06929422f0042e8a413df5d1f4a" exitCode=0
Mar 20 17:45:01 crc kubenswrapper[4795]: I0320 17:45:01.604941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb" event={"ID":"cd60241d-b207-4a9a-86b6-3be32ab282d3","Type":"ContainerDied","Data":"7e78ac608afa56e8111695b336413ee802aca06929422f0042e8a413df5d1f4a"}
Mar 20 17:45:01 crc kubenswrapper[4795]: I0320 17:45:01.605157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb" event={"ID":"cd60241d-b207-4a9a-86b6-3be32ab282d3","Type":"ContainerStarted","Data":"81730fe9e4abf54b7e51797503af0b395a54f504426180c9e760735101e8ae5e"}
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.085860 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.269412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume\") pod \"cd60241d-b207-4a9a-86b6-3be32ab282d3\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") "
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.269736 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume\") pod \"cd60241d-b207-4a9a-86b6-3be32ab282d3\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") "
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.269783 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llqvx\" (UniqueName: \"kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx\") pod \"cd60241d-b207-4a9a-86b6-3be32ab282d3\" (UID: \"cd60241d-b207-4a9a-86b6-3be32ab282d3\") "
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.270461 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume" (OuterVolumeSpecName: "config-volume") pod "cd60241d-b207-4a9a-86b6-3be32ab282d3" (UID: "cd60241d-b207-4a9a-86b6-3be32ab282d3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.277455 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cd60241d-b207-4a9a-86b6-3be32ab282d3" (UID: "cd60241d-b207-4a9a-86b6-3be32ab282d3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.283816 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx" (OuterVolumeSpecName: "kube-api-access-llqvx") pod "cd60241d-b207-4a9a-86b6-3be32ab282d3" (UID: "cd60241d-b207-4a9a-86b6-3be32ab282d3"). InnerVolumeSpecName "kube-api-access-llqvx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.371988 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd60241d-b207-4a9a-86b6-3be32ab282d3-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.372126 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd60241d-b207-4a9a-86b6-3be32ab282d3-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.372308 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llqvx\" (UniqueName: \"kubernetes.io/projected/cd60241d-b207-4a9a-86b6-3be32ab282d3-kube-api-access-llqvx\") on node \"crc\" DevicePath \"\""
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.634046 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb" event={"ID":"cd60241d-b207-4a9a-86b6-3be32ab282d3","Type":"ContainerDied","Data":"81730fe9e4abf54b7e51797503af0b395a54f504426180c9e760735101e8ae5e"}
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.634105 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81730fe9e4abf54b7e51797503af0b395a54f504426180c9e760735101e8ae5e"
Mar 20 17:45:03 crc kubenswrapper[4795]: I0320 17:45:03.634185 4795 util.go:48] "No ready sandbox for pod can be
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"
Mar 20 17:45:11 crc kubenswrapper[4795]: I0320 17:45:11.300206 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:45:11 crc kubenswrapper[4795]: I0320 17:45:11.301083 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:45:41 crc kubenswrapper[4795]: I0320 17:45:41.300883 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:45:41 crc kubenswrapper[4795]: I0320 17:45:41.301638 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:45:41 crc kubenswrapper[4795]: I0320 17:45:41.301755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt"
Mar 20 17:45:41 crc kubenswrapper[4795]: I0320 17:45:41.303064 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 17:45:41 crc kubenswrapper[4795]: I0320 17:45:41.303191 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" gracePeriod=600
Mar 20 17:45:41 crc kubenswrapper[4795]: E0320 17:45:41.445476 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:45:42 crc kubenswrapper[4795]: I0320 17:45:42.050638 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" exitCode=0
Mar 20 17:45:42 crc kubenswrapper[4795]: I0320 17:45:42.050786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"}
Mar 20 17:45:42 crc kubenswrapper[4795]: I0320 17:45:42.050836 4795 scope.go:117] "RemoveContainer" containerID="c7dff0812ac3b9b01ead440177268d04495d786b2df7ccfdd3450c0c48599cc1"
Mar 20 17:45:42 crc kubenswrapper[4795]: I0320 17:45:42.051428 4795
scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"
Mar 20 17:45:42 crc kubenswrapper[4795]: E0320 17:45:42.051666 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:45:45 crc kubenswrapper[4795]: I0320 17:45:45.919108 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"]
Mar 20 17:45:45 crc kubenswrapper[4795]: E0320 17:45:45.920218 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd60241d-b207-4a9a-86b6-3be32ab282d3" containerName="collect-profiles"
Mar 20 17:45:45 crc kubenswrapper[4795]: I0320 17:45:45.920232 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd60241d-b207-4a9a-86b6-3be32ab282d3" containerName="collect-profiles"
Mar 20 17:45:45 crc kubenswrapper[4795]: I0320 17:45:45.920430 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd60241d-b207-4a9a-86b6-3be32ab282d3" containerName="collect-profiles"
Mar 20 17:45:45 crc kubenswrapper[4795]: I0320 17:45:45.924535 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:45 crc kubenswrapper[4795]: I0320 17:45:45.945067 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"]
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.075644 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.075838 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-292px\" (UniqueName: \"kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.076037 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.178210 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-292px\" (UniqueName: \"kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.178272 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.178304 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.178742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.178775 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.211932 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-292px\" (UniqueName: \"kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px\") pod \"redhat-marketplace-lqwpr\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " pod="openshift-marketplace/redhat-marketplace-lqwpr"
Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.246298 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:46 crc kubenswrapper[4795]: I0320 17:45:46.727323 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"] Mar 20 17:45:47 crc kubenswrapper[4795]: I0320 17:45:47.097714 4795 generic.go:334] "Generic (PLEG): container finished" podID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerID="9c4ccb984fcea0c869be462be23965db4e7b09dccf48c35e2fcdcc062beca428" exitCode=0 Mar 20 17:45:47 crc kubenswrapper[4795]: I0320 17:45:47.097757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerDied","Data":"9c4ccb984fcea0c869be462be23965db4e7b09dccf48c35e2fcdcc062beca428"} Mar 20 17:45:47 crc kubenswrapper[4795]: I0320 17:45:47.097797 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerStarted","Data":"fac724dbd822f9dfabd9959c9d8174e232247e72a2949be62d1ad30b2c3b4ba9"} Mar 20 17:45:49 crc kubenswrapper[4795]: I0320 17:45:49.117876 4795 generic.go:334] "Generic (PLEG): container finished" podID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerID="ffd6f57295f980f22dcb8cb678472859d50f02ddbc01cd1b7e0bbaa6f1d24f8d" exitCode=0 Mar 20 17:45:49 crc kubenswrapper[4795]: I0320 17:45:49.117976 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerDied","Data":"ffd6f57295f980f22dcb8cb678472859d50f02ddbc01cd1b7e0bbaa6f1d24f8d"} Mar 20 17:45:50 crc kubenswrapper[4795]: I0320 17:45:50.130602 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" 
event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerStarted","Data":"57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e"} Mar 20 17:45:55 crc kubenswrapper[4795]: I0320 17:45:55.255213 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:45:55 crc kubenswrapper[4795]: E0320 17:45:55.256321 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:45:56 crc kubenswrapper[4795]: I0320 17:45:56.247384 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:56 crc kubenswrapper[4795]: I0320 17:45:56.247466 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:56 crc kubenswrapper[4795]: I0320 17:45:56.310766 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:56 crc kubenswrapper[4795]: I0320 17:45:56.335457 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lqwpr" podStartSLOduration=8.708193489 podStartE2EDuration="11.335434024s" podCreationTimestamp="2026-03-20 17:45:45 +0000 UTC" firstStartedPulling="2026-03-20 17:45:47.100707401 +0000 UTC m=+1690.558738942" lastFinishedPulling="2026-03-20 17:45:49.727947926 +0000 UTC m=+1693.185979477" observedRunningTime="2026-03-20 17:45:50.157599398 +0000 UTC m=+1693.615630939" watchObservedRunningTime="2026-03-20 17:45:56.335434024 
+0000 UTC m=+1699.793465575" Mar 20 17:45:57 crc kubenswrapper[4795]: I0320 17:45:57.265972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:45:57 crc kubenswrapper[4795]: I0320 17:45:57.332887 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"] Mar 20 17:45:59 crc kubenswrapper[4795]: I0320 17:45:59.226340 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lqwpr" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="registry-server" containerID="cri-o://57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e" gracePeriod=2 Mar 20 17:45:59 crc kubenswrapper[4795]: E0320 17:45:59.799448 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf700090f_cd79_4419_aeb1_7cf66ba3fcf5.slice/crio-conmon-57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.161221 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567146-xzfkq"] Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.166158 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.173230 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.173414 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.173597 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.178973 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-xzfkq"] Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.238165 4795 generic.go:334] "Generic (PLEG): container finished" podID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerID="57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e" exitCode=0 Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.238214 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerDied","Data":"57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e"} Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.343837 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s57g\" (UniqueName: \"kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g\") pod \"auto-csr-approver-29567146-xzfkq\" (UID: \"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c\") " pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.440230 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.445321 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s57g\" (UniqueName: \"kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g\") pod \"auto-csr-approver-29567146-xzfkq\" (UID: \"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c\") " pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.469102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s57g\" (UniqueName: \"kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g\") pod \"auto-csr-approver-29567146-xzfkq\" (UID: \"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c\") " pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.500800 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.546410 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content\") pod \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.546586 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-292px\" (UniqueName: \"kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px\") pod \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.546715 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities\") pod \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\" (UID: \"f700090f-cd79-4419-aeb1-7cf66ba3fcf5\") " Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.547519 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities" (OuterVolumeSpecName: "utilities") pod "f700090f-cd79-4419-aeb1-7cf66ba3fcf5" (UID: "f700090f-cd79-4419-aeb1-7cf66ba3fcf5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.549440 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px" (OuterVolumeSpecName: "kube-api-access-292px") pod "f700090f-cd79-4419-aeb1-7cf66ba3fcf5" (UID: "f700090f-cd79-4419-aeb1-7cf66ba3fcf5"). InnerVolumeSpecName "kube-api-access-292px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.586833 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f700090f-cd79-4419-aeb1-7cf66ba3fcf5" (UID: "f700090f-cd79-4419-aeb1-7cf66ba3fcf5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.648882 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.648912 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.648923 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-292px\" (UniqueName: \"kubernetes.io/projected/f700090f-cd79-4419-aeb1-7cf66ba3fcf5-kube-api-access-292px\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:00 crc kubenswrapper[4795]: W0320 17:46:00.979794 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ce1ddf5_f6e1_40ab_926d_4cf03d502e9c.slice/crio-3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372 WatchSource:0}: Error finding container 3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372: Status 404 returned error can't find the container with id 3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372 Mar 20 17:46:00 crc kubenswrapper[4795]: I0320 17:46:00.980182 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29567146-xzfkq"] Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.280362 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqwpr" Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.293599 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" event={"ID":"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c","Type":"ContainerStarted","Data":"3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372"} Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.293662 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqwpr" event={"ID":"f700090f-cd79-4419-aeb1-7cf66ba3fcf5","Type":"ContainerDied","Data":"fac724dbd822f9dfabd9959c9d8174e232247e72a2949be62d1ad30b2c3b4ba9"} Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.293750 4795 scope.go:117] "RemoveContainer" containerID="57a92f6427fbbff601384dd02d0a818a1c7453bb383b5400b15f8a7d44e8f57e" Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.339608 4795 scope.go:117] "RemoveContainer" containerID="ffd6f57295f980f22dcb8cb678472859d50f02ddbc01cd1b7e0bbaa6f1d24f8d" Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.342794 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"] Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.366095 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqwpr"] Mar 20 17:46:01 crc kubenswrapper[4795]: I0320 17:46:01.382577 4795 scope.go:117] "RemoveContainer" containerID="9c4ccb984fcea0c869be462be23965db4e7b09dccf48c35e2fcdcc062beca428" Mar 20 17:46:02 crc kubenswrapper[4795]: I0320 17:46:02.294975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" 
event={"ID":"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c","Type":"ContainerStarted","Data":"19365e3a16d7780a49439b00d5a850dc06a2d65e28a412bb9b05e779d9d4ec51"} Mar 20 17:46:02 crc kubenswrapper[4795]: I0320 17:46:02.317539 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" podStartSLOduration=1.4386096959999999 podStartE2EDuration="2.31751409s" podCreationTimestamp="2026-03-20 17:46:00 +0000 UTC" firstStartedPulling="2026-03-20 17:46:00.983049483 +0000 UTC m=+1704.441081034" lastFinishedPulling="2026-03-20 17:46:01.861953887 +0000 UTC m=+1705.319985428" observedRunningTime="2026-03-20 17:46:02.308763486 +0000 UTC m=+1705.766795077" watchObservedRunningTime="2026-03-20 17:46:02.31751409 +0000 UTC m=+1705.775545631" Mar 20 17:46:03 crc kubenswrapper[4795]: I0320 17:46:03.270664 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" path="/var/lib/kubelet/pods/f700090f-cd79-4419-aeb1-7cf66ba3fcf5/volumes" Mar 20 17:46:03 crc kubenswrapper[4795]: I0320 17:46:03.307197 4795 generic.go:334] "Generic (PLEG): container finished" podID="6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" containerID="19365e3a16d7780a49439b00d5a850dc06a2d65e28a412bb9b05e779d9d4ec51" exitCode=0 Mar 20 17:46:03 crc kubenswrapper[4795]: I0320 17:46:03.307242 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" event={"ID":"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c","Type":"ContainerDied","Data":"19365e3a16d7780a49439b00d5a850dc06a2d65e28a412bb9b05e779d9d4ec51"} Mar 20 17:46:04 crc kubenswrapper[4795]: I0320 17:46:04.790721 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:04 crc kubenswrapper[4795]: I0320 17:46:04.963673 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s57g\" (UniqueName: \"kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g\") pod \"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c\" (UID: \"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c\") " Mar 20 17:46:04 crc kubenswrapper[4795]: I0320 17:46:04.968310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g" (OuterVolumeSpecName: "kube-api-access-5s57g") pod "6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" (UID: "6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c"). InnerVolumeSpecName "kube-api-access-5s57g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.065458 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s57g\" (UniqueName: \"kubernetes.io/projected/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c-kube-api-access-5s57g\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.328854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" event={"ID":"6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c","Type":"ContainerDied","Data":"3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372"} Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.328896 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cd39a127894f56d7f5def0980d153e45440ffb887739361dbdac6c85cc4d372" Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.328952 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-xzfkq" Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.398822 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-s5wtb"] Mar 20 17:46:05 crc kubenswrapper[4795]: I0320 17:46:05.409146 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-s5wtb"] Mar 20 17:46:07 crc kubenswrapper[4795]: I0320 17:46:07.270866 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e6fe9e-d22e-420c-b050-a00a53749f1f" path="/var/lib/kubelet/pods/f9e6fe9e-d22e-420c-b050-a00a53749f1f/volumes" Mar 20 17:46:10 crc kubenswrapper[4795]: I0320 17:46:10.252541 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:46:10 crc kubenswrapper[4795]: E0320 17:46:10.253317 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:46:22 crc kubenswrapper[4795]: I0320 17:46:22.252559 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:46:22 crc kubenswrapper[4795]: E0320 17:46:22.253748 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:46:32 crc kubenswrapper[4795]: I0320 17:46:32.637603 4795 generic.go:334] "Generic (PLEG): container finished" podID="0708214e-e711-465a-a54e-97a462b2777e" containerID="c179fa5fa1c857ee73ee0d25264e475b241069dffa52757206eb76081b38cae9" exitCode=0 Mar 20 17:46:32 crc kubenswrapper[4795]: I0320 17:46:32.637741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" event={"ID":"0708214e-e711-465a-a54e-97a462b2777e","Type":"ContainerDied","Data":"c179fa5fa1c857ee73ee0d25264e475b241069dffa52757206eb76081b38cae9"} Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.212809 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.252357 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.252581 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.264605 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle\") pod \"0708214e-e711-465a-a54e-97a462b2777e\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.264837 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kzxfh\" (UniqueName: \"kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh\") pod \"0708214e-e711-465a-a54e-97a462b2777e\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.264881 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory\") pod \"0708214e-e711-465a-a54e-97a462b2777e\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.264914 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam\") pod \"0708214e-e711-465a-a54e-97a462b2777e\" (UID: \"0708214e-e711-465a-a54e-97a462b2777e\") " Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.269662 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh" (OuterVolumeSpecName: "kube-api-access-kzxfh") pod "0708214e-e711-465a-a54e-97a462b2777e" (UID: "0708214e-e711-465a-a54e-97a462b2777e"). InnerVolumeSpecName "kube-api-access-kzxfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.270906 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0708214e-e711-465a-a54e-97a462b2777e" (UID: "0708214e-e711-465a-a54e-97a462b2777e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.291465 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory" (OuterVolumeSpecName: "inventory") pod "0708214e-e711-465a-a54e-97a462b2777e" (UID: "0708214e-e711-465a-a54e-97a462b2777e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.320267 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0708214e-e711-465a-a54e-97a462b2777e" (UID: "0708214e-e711-465a-a54e-97a462b2777e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.372111 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzxfh\" (UniqueName: \"kubernetes.io/projected/0708214e-e711-465a-a54e-97a462b2777e-kube-api-access-kzxfh\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.372169 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.372197 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.372225 4795 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0708214e-e711-465a-a54e-97a462b2777e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.666593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" event={"ID":"0708214e-e711-465a-a54e-97a462b2777e","Type":"ContainerDied","Data":"8048e17374d65a5593e7f3026aacfa891127041d59875217add93959662e9cdc"} Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.666654 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8048e17374d65a5593e7f3026aacfa891127041d59875217add93959662e9cdc" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.667030 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-65dps" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.787838 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k"] Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.788274 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="extract-content" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788296 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="extract-content" Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.788315 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0708214e-e711-465a-a54e-97a462b2777e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788324 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0708214e-e711-465a-a54e-97a462b2777e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.788355 
4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="registry-server" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788364 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="registry-server" Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.788384 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" containerName="oc" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788394 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" containerName="oc" Mar 20 17:46:34 crc kubenswrapper[4795]: E0320 17:46:34.788409 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="extract-utilities" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788417 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="extract-utilities" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788634 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0708214e-e711-465a-a54e-97a462b2777e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788657 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f700090f-cd79-4419-aeb1-7cf66ba3fcf5" containerName="registry-server" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.788674 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" containerName="oc" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.789395 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.791598 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.792134 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.792704 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.793473 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.797636 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k"] Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.889088 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.889310 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 
17:46:34.889363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6mp\" (UniqueName: \"kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.991255 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.991650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.991787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw6mp\" (UniqueName: \"kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.998308 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:34 crc kubenswrapper[4795]: I0320 17:46:34.998492 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:35 crc kubenswrapper[4795]: I0320 17:46:35.025044 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw6mp\" (UniqueName: \"kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:35 crc kubenswrapper[4795]: I0320 17:46:35.122904 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:46:35 crc kubenswrapper[4795]: I0320 17:46:35.719383 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k"] Mar 20 17:46:36 crc kubenswrapper[4795]: I0320 17:46:36.690022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" event={"ID":"b0af5324-4ba3-4a12-9fdb-b467918ba19d","Type":"ContainerStarted","Data":"332d037bd56e3c7a734160d494102af123e05d9ad0b58769a38e86095019dcc7"} Mar 20 17:46:36 crc kubenswrapper[4795]: I0320 17:46:36.690350 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" event={"ID":"b0af5324-4ba3-4a12-9fdb-b467918ba19d","Type":"ContainerStarted","Data":"6c07a2c08823474a88320c9dd611604352c7622a34255df23ca241659b07db6b"} Mar 20 17:46:36 crc kubenswrapper[4795]: I0320 17:46:36.716903 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" podStartSLOduration=2.192368864 podStartE2EDuration="2.716879227s" podCreationTimestamp="2026-03-20 17:46:34 +0000 UTC" firstStartedPulling="2026-03-20 17:46:35.714667032 +0000 UTC m=+1739.172698583" lastFinishedPulling="2026-03-20 17:46:36.239177395 +0000 UTC m=+1739.697208946" observedRunningTime="2026-03-20 17:46:36.70865215 +0000 UTC m=+1740.166683701" watchObservedRunningTime="2026-03-20 17:46:36.716879227 +0000 UTC m=+1740.174910778" Mar 20 17:46:47 crc kubenswrapper[4795]: I0320 17:46:47.266220 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:46:47 crc kubenswrapper[4795]: E0320 17:46:47.267201 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:46:51 crc kubenswrapper[4795]: I0320 17:46:51.966387 4795 scope.go:117] "RemoveContainer" containerID="8ac90f4263985d5a19d6f00ac01d70eb81681c5a298c4e2c5302052e573286a6" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.065248 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-c5rg6"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.084829 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ncfp9"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.097854 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-30ae-account-create-update-d79gp"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.108595 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a409-account-create-update-zvscf"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.117573 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8d96q"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.127567 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-30ae-account-create-update-d79gp"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.136316 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-c5rg6"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.145963 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ncfp9"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.153763 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8d96q"] Mar 20 17:46:57 crc 
kubenswrapper[4795]: I0320 17:46:57.161487 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a409-account-create-update-zvscf"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.169671 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1681-account-create-update-vpwb2"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.180745 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1681-account-create-update-vpwb2"] Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.266405 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c13d27e-7afd-4113-9b7d-fcbf7eb87c87" path="/var/lib/kubelet/pods/1c13d27e-7afd-4113-9b7d-fcbf7eb87c87/volumes" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.267402 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389c1f10-5aba-4c4d-b0b3-3a38f6038536" path="/var/lib/kubelet/pods/389c1f10-5aba-4c4d-b0b3-3a38f6038536/volumes" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.268195 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aac28d5-6b58-424e-83f8-ec71c53e41ce" path="/var/lib/kubelet/pods/6aac28d5-6b58-424e-83f8-ec71c53e41ce/volumes" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.269057 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acfb1ea8-a8d2-4152-ad18-54d380b289c4" path="/var/lib/kubelet/pods/acfb1ea8-a8d2-4152-ad18-54d380b289c4/volumes" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.271058 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9265b8c-0b80-47d9-8f4b-3d996233341e" path="/var/lib/kubelet/pods/c9265b8c-0b80-47d9-8f4b-3d996233341e/volumes" Mar 20 17:46:57 crc kubenswrapper[4795]: I0320 17:46:57.272499 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc" 
path="/var/lib/kubelet/pods/f0e1d5fd-0ec6-440f-a31e-fb30d5d70ffc/volumes" Mar 20 17:46:59 crc kubenswrapper[4795]: I0320 17:46:59.253121 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:46:59 crc kubenswrapper[4795]: E0320 17:46:59.253622 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:47:12 crc kubenswrapper[4795]: I0320 17:47:12.252755 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:47:12 crc kubenswrapper[4795]: E0320 17:47:12.253876 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:47:18 crc kubenswrapper[4795]: I0320 17:47:18.079922 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-d7ffs"] Mar 20 17:47:18 crc kubenswrapper[4795]: I0320 17:47:18.101170 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-d7ffs"] Mar 20 17:47:19 crc kubenswrapper[4795]: I0320 17:47:19.275080 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc63f125-2d90-43df-a863-b85fb2eb690e" 
path="/var/lib/kubelet/pods/fc63f125-2d90-43df-a863-b85fb2eb690e/volumes" Mar 20 17:47:25 crc kubenswrapper[4795]: I0320 17:47:25.042838 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-sv5fz"] Mar 20 17:47:25 crc kubenswrapper[4795]: I0320 17:47:25.057263 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-sv5fz"] Mar 20 17:47:25 crc kubenswrapper[4795]: I0320 17:47:25.266031 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e951c331-872c-41b6-b747-d5129b8c0a1b" path="/var/lib/kubelet/pods/e951c331-872c-41b6-b747-d5129b8c0a1b/volumes" Mar 20 17:47:27 crc kubenswrapper[4795]: I0320 17:47:27.286925 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:47:27 crc kubenswrapper[4795]: E0320 17:47:27.288115 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:47:38 crc kubenswrapper[4795]: I0320 17:47:38.252240 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:47:38 crc kubenswrapper[4795]: E0320 17:47:38.255958 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:47:41 crc 
kubenswrapper[4795]: I0320 17:47:41.069891 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2xqcf"] Mar 20 17:47:41 crc kubenswrapper[4795]: I0320 17:47:41.086051 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2xqcf"] Mar 20 17:47:41 crc kubenswrapper[4795]: I0320 17:47:41.097873 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-fe40-account-create-update-jh9t8"] Mar 20 17:47:41 crc kubenswrapper[4795]: I0320 17:47:41.107894 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-fe40-account-create-update-jh9t8"] Mar 20 17:47:41 crc kubenswrapper[4795]: I0320 17:47:41.271785 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ff9ec79-6bd9-470e-8a75-8df1f3c52851" path="/var/lib/kubelet/pods/9ff9ec79-6bd9-470e-8a75-8df1f3c52851/volumes" Mar 20 17:47:41 crc kubenswrapper[4795]: I0320 17:47:41.273089 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e065e2d4-096b-426b-a1f8-14311adb7cbc" path="/var/lib/kubelet/pods/e065e2d4-096b-426b-a1f8-14311adb7cbc/volumes" Mar 20 17:47:42 crc kubenswrapper[4795]: I0320 17:47:42.025492 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-b5nks"] Mar 20 17:47:42 crc kubenswrapper[4795]: I0320 17:47:42.031840 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-b5nks"] Mar 20 17:47:43 crc kubenswrapper[4795]: I0320 17:47:43.274765 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="373ddf98-d9da-4f1f-a6be-3d16e3cbad57" path="/var/lib/kubelet/pods/373ddf98-d9da-4f1f-a6be-3d16e3cbad57/volumes" Mar 20 17:47:44 crc kubenswrapper[4795]: I0320 17:47:44.050953 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0ac3-account-create-update-km4zq"] Mar 20 17:47:44 crc kubenswrapper[4795]: I0320 17:47:44.066925 4795 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/neutron-db-create-d5tx6"] Mar 20 17:47:44 crc kubenswrapper[4795]: I0320 17:47:44.076855 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0ac3-account-create-update-km4zq"] Mar 20 17:47:44 crc kubenswrapper[4795]: I0320 17:47:44.086524 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-d5tx6"] Mar 20 17:47:45 crc kubenswrapper[4795]: I0320 17:47:45.042773 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c60c-account-create-update-hhzrt"] Mar 20 17:47:45 crc kubenswrapper[4795]: I0320 17:47:45.057512 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c60c-account-create-update-hhzrt"] Mar 20 17:47:45 crc kubenswrapper[4795]: I0320 17:47:45.270208 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b1c5f0-e7fb-44b7-8c75-c8036f371c56" path="/var/lib/kubelet/pods/18b1c5f0-e7fb-44b7-8c75-c8036f371c56/volumes" Mar 20 17:47:45 crc kubenswrapper[4795]: I0320 17:47:45.271615 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0847a4-54b5-4068-bfa8-730a19e96d9c" path="/var/lib/kubelet/pods/1c0847a4-54b5-4068-bfa8-730a19e96d9c/volumes" Mar 20 17:47:45 crc kubenswrapper[4795]: I0320 17:47:45.272919 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d71698-4dc2-448a-9330-23372e2d508b" path="/var/lib/kubelet/pods/36d71698-4dc2-448a-9330-23372e2d508b/volumes" Mar 20 17:47:49 crc kubenswrapper[4795]: I0320 17:47:49.034862 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dwhh5"] Mar 20 17:47:49 crc kubenswrapper[4795]: I0320 17:47:49.041599 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dwhh5"] Mar 20 17:47:49 crc kubenswrapper[4795]: I0320 17:47:49.270956 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b15c724-622b-4da7-96a3-01949d04ecac" 
path="/var/lib/kubelet/pods/7b15c724-622b-4da7-96a3-01949d04ecac/volumes" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.110965 4795 scope.go:117] "RemoveContainer" containerID="3e3072d7a6a60ff440da8ec24082885e62958e6ce5ded9fd9910a3d0c2817a07" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.168748 4795 scope.go:117] "RemoveContainer" containerID="1fe0f6a7ba267ec0588c7d4179b78569ace69ae42af4f4ce02a9e28bfc87aa93" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.235398 4795 scope.go:117] "RemoveContainer" containerID="0bd7daab0116804ff4450e365b81c7b208a20da2cd4b665ce83729724da32638" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.298079 4795 scope.go:117] "RemoveContainer" containerID="497d569160d86f7ef365c3f9c537432bd00933f71438ea39707377d46eebd046" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.324754 4795 scope.go:117] "RemoveContainer" containerID="fa7f816c765d44ed743198c38348dd663b04f7cfc3b7f6aac5dffa2623d4db45" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.365291 4795 scope.go:117] "RemoveContainer" containerID="470c232d3b8dc5c0134ac3e2610bcf258029ab0b51ac60e2a7728f94a3beb865" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.418631 4795 scope.go:117] "RemoveContainer" containerID="115e8aa5e635a588311da0792150e7730feaab865eb0acb01117eb70b42bfde3" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.437660 4795 scope.go:117] "RemoveContainer" containerID="d85d02558764daff4d2300daa1f7a51dd79d0b89452cbb1821643bd1f3d0ff3c" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.458549 4795 scope.go:117] "RemoveContainer" containerID="0f5ef18005b655abcc8e4883b9bee8538648f3cf86fe68a6e17cb1ecb194c52e" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.491574 4795 scope.go:117] "RemoveContainer" containerID="2d63356a6d0232331bb76203b1359e46e0f2a21a5ebc5f3160865388f8cf9a1c" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.520210 4795 scope.go:117] "RemoveContainer" 
containerID="e0dfbaccbeeb5b8f99fb5498e364810e7c89661123cf8b487af69c7d6020134e" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.552276 4795 scope.go:117] "RemoveContainer" containerID="c288b4ff895d130555ade7ce513d591493310d7fb3678ee47968d204fe11297a" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.579860 4795 scope.go:117] "RemoveContainer" containerID="5b477a108858cabbe8510a1a17d7f7ac3c69ce053e8fe87204336bad4594bfc0" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.633369 4795 scope.go:117] "RemoveContainer" containerID="5d693b98a616da996bc733e3508b576b31f68d8eb1c9fc7b9800283fac04b343" Mar 20 17:47:52 crc kubenswrapper[4795]: I0320 17:47:52.666763 4795 scope.go:117] "RemoveContainer" containerID="61c0a7747547de21c917366527d52306c78b302a64918960d0e832416be0ca0f" Mar 20 17:47:53 crc kubenswrapper[4795]: I0320 17:47:53.253404 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:47:53 crc kubenswrapper[4795]: E0320 17:47:53.254326 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.175409 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567148-nqw6d"] Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.178776 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.184940 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.189433 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.190334 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.213659 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfp2\" (UniqueName: \"kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2\") pod \"auto-csr-approver-29567148-nqw6d\" (UID: \"5541d8b2-57fb-4162-8ee0-ac6630a5d91c\") " pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.242105 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-nqw6d"] Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.315756 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfp2\" (UniqueName: \"kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2\") pod \"auto-csr-approver-29567148-nqw6d\" (UID: \"5541d8b2-57fb-4162-8ee0-ac6630a5d91c\") " pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.344587 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfp2\" (UniqueName: \"kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2\") pod \"auto-csr-approver-29567148-nqw6d\" (UID: \"5541d8b2-57fb-4162-8ee0-ac6630a5d91c\") " 
pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:00 crc kubenswrapper[4795]: I0320 17:48:00.524646 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:01 crc kubenswrapper[4795]: I0320 17:48:01.023594 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-nqw6d"] Mar 20 17:48:01 crc kubenswrapper[4795]: I0320 17:48:01.724588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" event={"ID":"5541d8b2-57fb-4162-8ee0-ac6630a5d91c","Type":"ContainerStarted","Data":"4c481ef9a098bd5699116bb215caa2af24c09ad9a40e254d4b9b5b9ecdd1f8d5"} Mar 20 17:48:03 crc kubenswrapper[4795]: I0320 17:48:03.750247 4795 generic.go:334] "Generic (PLEG): container finished" podID="5541d8b2-57fb-4162-8ee0-ac6630a5d91c" containerID="29ff925c9ace295b0e664d8ded17085588e84c04311df585ee77cb9a00150d0d" exitCode=0 Mar 20 17:48:03 crc kubenswrapper[4795]: I0320 17:48:03.750346 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" event={"ID":"5541d8b2-57fb-4162-8ee0-ac6630a5d91c","Type":"ContainerDied","Data":"29ff925c9ace295b0e664d8ded17085588e84c04311df585ee77cb9a00150d0d"} Mar 20 17:48:04 crc kubenswrapper[4795]: I0320 17:48:04.252806 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:48:04 crc kubenswrapper[4795]: E0320 17:48:04.253283 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" 
Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.079064 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.146484 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shfp2\" (UniqueName: \"kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2\") pod \"5541d8b2-57fb-4162-8ee0-ac6630a5d91c\" (UID: \"5541d8b2-57fb-4162-8ee0-ac6630a5d91c\") " Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.153529 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2" (OuterVolumeSpecName: "kube-api-access-shfp2") pod "5541d8b2-57fb-4162-8ee0-ac6630a5d91c" (UID: "5541d8b2-57fb-4162-8ee0-ac6630a5d91c"). InnerVolumeSpecName "kube-api-access-shfp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.248231 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shfp2\" (UniqueName: \"kubernetes.io/projected/5541d8b2-57fb-4162-8ee0-ac6630a5d91c-kube-api-access-shfp2\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.781477 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" event={"ID":"5541d8b2-57fb-4162-8ee0-ac6630a5d91c","Type":"ContainerDied","Data":"4c481ef9a098bd5699116bb215caa2af24c09ad9a40e254d4b9b5b9ecdd1f8d5"} Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.781537 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c481ef9a098bd5699116bb215caa2af24c09ad9a40e254d4b9b5b9ecdd1f8d5" Mar 20 17:48:05 crc kubenswrapper[4795]: I0320 17:48:05.781536 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-nqw6d" Mar 20 17:48:06 crc kubenswrapper[4795]: I0320 17:48:06.154785 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-zrn58"] Mar 20 17:48:06 crc kubenswrapper[4795]: I0320 17:48:06.165367 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-zrn58"] Mar 20 17:48:07 crc kubenswrapper[4795]: I0320 17:48:07.280670 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df931d18-2dae-408e-823d-45c28b0a31c2" path="/var/lib/kubelet/pods/df931d18-2dae-408e-823d-45c28b0a31c2/volumes" Mar 20 17:48:12 crc kubenswrapper[4795]: I0320 17:48:12.863833 4795 generic.go:334] "Generic (PLEG): container finished" podID="b0af5324-4ba3-4a12-9fdb-b467918ba19d" containerID="332d037bd56e3c7a734160d494102af123e05d9ad0b58769a38e86095019dcc7" exitCode=0 Mar 20 17:48:12 crc kubenswrapper[4795]: I0320 17:48:12.864532 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" event={"ID":"b0af5324-4ba3-4a12-9fdb-b467918ba19d","Type":"ContainerDied","Data":"332d037bd56e3c7a734160d494102af123e05d9ad0b58769a38e86095019dcc7"} Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.459006 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.550818 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory\") pod \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.550976 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw6mp\" (UniqueName: \"kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp\") pod \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.551155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam\") pod \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\" (UID: \"b0af5324-4ba3-4a12-9fdb-b467918ba19d\") " Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.557852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp" (OuterVolumeSpecName: "kube-api-access-pw6mp") pod "b0af5324-4ba3-4a12-9fdb-b467918ba19d" (UID: "b0af5324-4ba3-4a12-9fdb-b467918ba19d"). InnerVolumeSpecName "kube-api-access-pw6mp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.577757 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b0af5324-4ba3-4a12-9fdb-b467918ba19d" (UID: "b0af5324-4ba3-4a12-9fdb-b467918ba19d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.595389 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory" (OuterVolumeSpecName: "inventory") pod "b0af5324-4ba3-4a12-9fdb-b467918ba19d" (UID: "b0af5324-4ba3-4a12-9fdb-b467918ba19d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.654610 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.654647 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw6mp\" (UniqueName: \"kubernetes.io/projected/b0af5324-4ba3-4a12-9fdb-b467918ba19d-kube-api-access-pw6mp\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.654664 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0af5324-4ba3-4a12-9fdb-b467918ba19d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.895109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" 
event={"ID":"b0af5324-4ba3-4a12-9fdb-b467918ba19d","Type":"ContainerDied","Data":"6c07a2c08823474a88320c9dd611604352c7622a34255df23ca241659b07db6b"} Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.895168 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c07a2c08823474a88320c9dd611604352c7622a34255df23ca241659b07db6b" Mar 20 17:48:14 crc kubenswrapper[4795]: I0320 17:48:14.895270 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.005115 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9"] Mar 20 17:48:15 crc kubenswrapper[4795]: E0320 17:48:15.005678 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5541d8b2-57fb-4162-8ee0-ac6630a5d91c" containerName="oc" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.005798 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5541d8b2-57fb-4162-8ee0-ac6630a5d91c" containerName="oc" Mar 20 17:48:15 crc kubenswrapper[4795]: E0320 17:48:15.005830 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0af5324-4ba3-4a12-9fdb-b467918ba19d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.005845 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0af5324-4ba3-4a12-9fdb-b467918ba19d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.006121 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5541d8b2-57fb-4162-8ee0-ac6630a5d91c" containerName="oc" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.006176 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0af5324-4ba3-4a12-9fdb-b467918ba19d" 
containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.007103 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.009413 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.011574 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.011678 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.012794 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.017592 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9"] Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.062936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.062994 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-545pr\" (UniqueName: \"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.063027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.165189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.165262 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-545pr\" (UniqueName: \"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.165301 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.170185 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.177195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.194239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-545pr\" (UniqueName: \"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hprm9\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.334600 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:48:15 crc kubenswrapper[4795]: I0320 17:48:15.917069 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9"] Mar 20 17:48:16 crc kubenswrapper[4795]: I0320 17:48:16.917338 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" event={"ID":"2bad20c9-d77a-4c30-8fa2-979c05697cf4","Type":"ContainerStarted","Data":"3ffae966dde36b17c8819bbe3306ed7854ac13b87b1dac795be893a2b620fea2"} Mar 20 17:48:16 crc kubenswrapper[4795]: I0320 17:48:16.917707 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" event={"ID":"2bad20c9-d77a-4c30-8fa2-979c05697cf4","Type":"ContainerStarted","Data":"4221084fbf63010ad9d50198b40243bcd607b968b80e4bc2aa8911a43dc05dbe"} Mar 20 17:48:16 crc kubenswrapper[4795]: I0320 17:48:16.943400 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" podStartSLOduration=2.488752527 podStartE2EDuration="2.943369474s" podCreationTimestamp="2026-03-20 17:48:14 +0000 UTC" firstStartedPulling="2026-03-20 17:48:15.918451829 +0000 UTC m=+1839.376483370" lastFinishedPulling="2026-03-20 17:48:16.373068736 +0000 UTC m=+1839.831100317" observedRunningTime="2026-03-20 17:48:16.934402255 +0000 UTC m=+1840.392433826" watchObservedRunningTime="2026-03-20 17:48:16.943369474 +0000 UTC m=+1840.401401055" Mar 20 17:48:17 crc kubenswrapper[4795]: I0320 17:48:17.266174 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:48:17 crc kubenswrapper[4795]: E0320 17:48:17.266804 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:48:27 crc kubenswrapper[4795]: I0320 17:48:27.050539 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qfpzw"] Mar 20 17:48:27 crc kubenswrapper[4795]: I0320 17:48:27.060374 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qfpzw"] Mar 20 17:48:27 crc kubenswrapper[4795]: I0320 17:48:27.271948 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4244f6d6-536a-4555-a05b-176d696d427d" path="/var/lib/kubelet/pods/4244f6d6-536a-4555-a05b-176d696d427d/volumes" Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.049472 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-nfr5n"] Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.066200 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7mx5b"] Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.077808 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7mx5b"] Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.087000 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-nfr5n"] Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.266570 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37537245-d57e-4087-ade6-6c028eb4d137" path="/var/lib/kubelet/pods/37537245-d57e-4087-ade6-6c028eb4d137/volumes" Mar 20 17:48:29 crc kubenswrapper[4795]: I0320 17:48:29.267193 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78238b29-6bdd-4f77-847e-731c6c785ed9" 
path="/var/lib/kubelet/pods/78238b29-6bdd-4f77-847e-731c6c785ed9/volumes" Mar 20 17:48:31 crc kubenswrapper[4795]: I0320 17:48:31.252858 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:48:31 crc kubenswrapper[4795]: E0320 17:48:31.254000 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:48:42 crc kubenswrapper[4795]: I0320 17:48:42.035890 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-rdxps"] Mar 20 17:48:42 crc kubenswrapper[4795]: I0320 17:48:42.043297 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-4t68k"] Mar 20 17:48:42 crc kubenswrapper[4795]: I0320 17:48:42.052160 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-rdxps"] Mar 20 17:48:42 crc kubenswrapper[4795]: I0320 17:48:42.059622 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-4t68k"] Mar 20 17:48:42 crc kubenswrapper[4795]: I0320 17:48:42.253151 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:48:42 crc kubenswrapper[4795]: E0320 17:48:42.253681 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:48:43 crc kubenswrapper[4795]: I0320 17:48:43.271875 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706c47a0-7763-44af-9b14-0e5322a8f2f1" path="/var/lib/kubelet/pods/706c47a0-7763-44af-9b14-0e5322a8f2f1/volumes" Mar 20 17:48:43 crc kubenswrapper[4795]: I0320 17:48:43.273177 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d254abd5-b344-416a-b99d-96737388795e" path="/var/lib/kubelet/pods/d254abd5-b344-416a-b99d-96737388795e/volumes" Mar 20 17:48:52 crc kubenswrapper[4795]: I0320 17:48:52.976420 4795 scope.go:117] "RemoveContainer" containerID="43011a486c98482642b4a5dbe9079dc55e5de2d50808977b7d9c6649a885404a" Mar 20 17:48:53 crc kubenswrapper[4795]: I0320 17:48:53.040311 4795 scope.go:117] "RemoveContainer" containerID="8e4952423fe886bac972193165a7d0b5d846db9f137b7cbf7c828182ef389d13" Mar 20 17:48:53 crc kubenswrapper[4795]: I0320 17:48:53.118201 4795 scope.go:117] "RemoveContainer" containerID="ca6e296f2643b1e5c67cd7c021c2bf95d4bbdd0b4c6082814566acaf425b562b" Mar 20 17:48:53 crc kubenswrapper[4795]: I0320 17:48:53.169207 4795 scope.go:117] "RemoveContainer" containerID="8f1173f5ebccc23013501b8ad9c477f608df64a752414481775b8bf5160525e1" Mar 20 17:48:53 crc kubenswrapper[4795]: I0320 17:48:53.216671 4795 scope.go:117] "RemoveContainer" containerID="20458e6912d6d217fb3ae1b5fc987499c631ca087920807a0e981310469342cf" Mar 20 17:48:53 crc kubenswrapper[4795]: I0320 17:48:53.271092 4795 scope.go:117] "RemoveContainer" containerID="b272744c883e204ac3c7a8e8c3e62d9d484f7c330b2609300488f12a64494d78" Mar 20 17:48:54 crc kubenswrapper[4795]: I0320 17:48:54.252524 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:48:54 crc kubenswrapper[4795]: E0320 17:48:54.253052 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:49:07 crc kubenswrapper[4795]: I0320 17:49:07.263457 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:49:07 crc kubenswrapper[4795]: E0320 17:49:07.264556 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:49:20 crc kubenswrapper[4795]: I0320 17:49:20.252391 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:49:20 crc kubenswrapper[4795]: E0320 17:49:20.253112 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.080312 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-fqwkd"] Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.094837 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-7051-account-create-update-d2d7p"] Mar 20 
17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.104026 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4l89c"] Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.113838 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-fqwkd"] Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.124994 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-7051-account-create-update-d2d7p"] Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.131756 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4l89c"] Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.289948 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d1dfe60-98b0-4644-b063-831293f9bd5c" path="/var/lib/kubelet/pods/7d1dfe60-98b0-4644-b063-831293f9bd5c/volumes" Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.291293 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42b654e-f003-45dd-a7c4-07655514643e" path="/var/lib/kubelet/pods/e42b654e-f003-45dd-a7c4-07655514643e/volumes" Mar 20 17:49:21 crc kubenswrapper[4795]: I0320 17:49:21.292128 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc90399-0b15-4fc6-b441-d7df6925c8aa" path="/var/lib/kubelet/pods/efc90399-0b15-4fc6-b441-d7df6925c8aa/volumes" Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.045978 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3748-account-create-update-j2khv"] Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.065642 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wd5n7"] Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.076835 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-100d-account-create-update-7l925"] Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.086263 4795 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-100d-account-create-update-7l925"] Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.094293 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3748-account-create-update-j2khv"] Mar 20 17:49:22 crc kubenswrapper[4795]: I0320 17:49:22.105321 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wd5n7"] Mar 20 17:49:23 crc kubenswrapper[4795]: I0320 17:49:23.271389 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a51797-b6d0-4b5b-9927-54d4b965469e" path="/var/lib/kubelet/pods/65a51797-b6d0-4b5b-9927-54d4b965469e/volumes" Mar 20 17:49:23 crc kubenswrapper[4795]: I0320 17:49:23.272881 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bd6f80f-7908-42b5-b32a-63d585bd9194" path="/var/lib/kubelet/pods/7bd6f80f-7908-42b5-b32a-63d585bd9194/volumes" Mar 20 17:49:23 crc kubenswrapper[4795]: I0320 17:49:23.274507 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f5daae9-920d-496a-ad6a-c016cfb82250" path="/var/lib/kubelet/pods/9f5daae9-920d-496a-ad6a-c016cfb82250/volumes" Mar 20 17:49:34 crc kubenswrapper[4795]: I0320 17:49:34.746990 4795 generic.go:334] "Generic (PLEG): container finished" podID="2bad20c9-d77a-4c30-8fa2-979c05697cf4" containerID="3ffae966dde36b17c8819bbe3306ed7854ac13b87b1dac795be893a2b620fea2" exitCode=0 Mar 20 17:49:34 crc kubenswrapper[4795]: I0320 17:49:34.747053 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" event={"ID":"2bad20c9-d77a-4c30-8fa2-979c05697cf4","Type":"ContainerDied","Data":"3ffae966dde36b17c8819bbe3306ed7854ac13b87b1dac795be893a2b620fea2"} Mar 20 17:49:35 crc kubenswrapper[4795]: I0320 17:49:35.253312 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:49:35 crc 
kubenswrapper[4795]: E0320 17:49:35.253793 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.291787 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.456581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory\") pod \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.456853 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-545pr\" (UniqueName: \"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr\") pod \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.456997 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam\") pod \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\" (UID: \"2bad20c9-d77a-4c30-8fa2-979c05697cf4\") " Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.470875 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr" (OuterVolumeSpecName: "kube-api-access-545pr") pod "2bad20c9-d77a-4c30-8fa2-979c05697cf4" (UID: "2bad20c9-d77a-4c30-8fa2-979c05697cf4"). InnerVolumeSpecName "kube-api-access-545pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.487714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory" (OuterVolumeSpecName: "inventory") pod "2bad20c9-d77a-4c30-8fa2-979c05697cf4" (UID: "2bad20c9-d77a-4c30-8fa2-979c05697cf4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.494462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2bad20c9-d77a-4c30-8fa2-979c05697cf4" (UID: "2bad20c9-d77a-4c30-8fa2-979c05697cf4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.560881 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.560945 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-545pr\" (UniqueName: \"kubernetes.io/projected/2bad20c9-d77a-4c30-8fa2-979c05697cf4-kube-api-access-545pr\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.560965 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bad20c9-d77a-4c30-8fa2-979c05697cf4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.775633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" event={"ID":"2bad20c9-d77a-4c30-8fa2-979c05697cf4","Type":"ContainerDied","Data":"4221084fbf63010ad9d50198b40243bcd607b968b80e4bc2aa8911a43dc05dbe"} Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.775725 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4221084fbf63010ad9d50198b40243bcd607b968b80e4bc2aa8911a43dc05dbe" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.775756 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hprm9" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.955968 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5"] Mar 20 17:49:36 crc kubenswrapper[4795]: E0320 17:49:36.956567 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bad20c9-d77a-4c30-8fa2-979c05697cf4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.956599 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bad20c9-d77a-4c30-8fa2-979c05697cf4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.957073 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bad20c9-d77a-4c30-8fa2-979c05697cf4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.959050 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.964241 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.964843 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.964851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.965053 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.978534 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5"] Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.984900 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:36 crc kubenswrapper[4795]: I0320 17:49:36.984981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:36 crc kubenswrapper[4795]: 
I0320 17:49:36.985130 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzgcb\" (UniqueName: \"kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.087757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzgcb\" (UniqueName: \"kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.088096 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.088172 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.093571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.094362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.115867 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzgcb\" (UniqueName: \"kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.307971 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.316574 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5"
Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.698715 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.712884 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5"]
Mar 20 17:49:37 crc kubenswrapper[4795]: I0320 17:49:37.788413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" event={"ID":"35b4aa82-d668-474b-b54d-b540190f5a6c","Type":"ContainerStarted","Data":"514ef5f301898c3756907819ae9e98ff22f8aada603fcffcf7334f351de655ef"}
Mar 20 17:49:38 crc kubenswrapper[4795]: I0320 17:49:38.125879 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 17:49:38 crc kubenswrapper[4795]: I0320 17:49:38.805617 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" event={"ID":"35b4aa82-d668-474b-b54d-b540190f5a6c","Type":"ContainerStarted","Data":"8e9a482ad1e6bf23da2ae93a35021c118d8ae4b279a7e4f08b403502cf6ed589"}
Mar 20 17:49:38 crc kubenswrapper[4795]: I0320 17:49:38.833481 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" podStartSLOduration=2.411258512 podStartE2EDuration="2.833461452s" podCreationTimestamp="2026-03-20 17:49:36 +0000 UTC" firstStartedPulling="2026-03-20 17:49:37.698401847 +0000 UTC m=+1921.156433378" lastFinishedPulling="2026-03-20 17:49:38.120604747 +0000 UTC m=+1921.578636318" observedRunningTime="2026-03-20 17:49:38.827230228 +0000 UTC m=+1922.285261769" watchObservedRunningTime="2026-03-20 17:49:38.833461452 +0000 UTC m=+1922.291493003"
Mar 20 17:49:43 crc kubenswrapper[4795]: I0320 17:49:43.859538 4795 generic.go:334] "Generic (PLEG): container finished" podID="35b4aa82-d668-474b-b54d-b540190f5a6c" containerID="8e9a482ad1e6bf23da2ae93a35021c118d8ae4b279a7e4f08b403502cf6ed589" exitCode=0
Mar 20 17:49:43 crc kubenswrapper[4795]: I0320 17:49:43.859592 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" event={"ID":"35b4aa82-d668-474b-b54d-b540190f5a6c","Type":"ContainerDied","Data":"8e9a482ad1e6bf23da2ae93a35021c118d8ae4b279a7e4f08b403502cf6ed589"}
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.358427 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5"
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.556796 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam\") pod \"35b4aa82-d668-474b-b54d-b540190f5a6c\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") "
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.557029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory\") pod \"35b4aa82-d668-474b-b54d-b540190f5a6c\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") "
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.557070 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzgcb\" (UniqueName: \"kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb\") pod \"35b4aa82-d668-474b-b54d-b540190f5a6c\" (UID: \"35b4aa82-d668-474b-b54d-b540190f5a6c\") "
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.563137 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb" (OuterVolumeSpecName: "kube-api-access-xzgcb") pod "35b4aa82-d668-474b-b54d-b540190f5a6c" (UID: "35b4aa82-d668-474b-b54d-b540190f5a6c"). InnerVolumeSpecName "kube-api-access-xzgcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.587664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory" (OuterVolumeSpecName: "inventory") pod "35b4aa82-d668-474b-b54d-b540190f5a6c" (UID: "35b4aa82-d668-474b-b54d-b540190f5a6c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.606607 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "35b4aa82-d668-474b-b54d-b540190f5a6c" (UID: "35b4aa82-d668-474b-b54d-b540190f5a6c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.660205 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.660258 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzgcb\" (UniqueName: \"kubernetes.io/projected/35b4aa82-d668-474b-b54d-b540190f5a6c-kube-api-access-xzgcb\") on node \"crc\" DevicePath \"\""
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.660283 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35b4aa82-d668-474b-b54d-b540190f5a6c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.884091 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5" event={"ID":"35b4aa82-d668-474b-b54d-b540190f5a6c","Type":"ContainerDied","Data":"514ef5f301898c3756907819ae9e98ff22f8aada603fcffcf7334f351de655ef"}
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.884125 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="514ef5f301898c3756907819ae9e98ff22f8aada603fcffcf7334f351de655ef"
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.884188 4795 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5"
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.988697 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"]
Mar 20 17:49:45 crc kubenswrapper[4795]: E0320 17:49:45.989245 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35b4aa82-d668-474b-b54d-b540190f5a6c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.989260 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b4aa82-d668-474b-b54d-b540190f5a6c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.989457 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="35b4aa82-d668-474b-b54d-b540190f5a6c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.990018 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.993842 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.995585 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.996032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 17:49:45 crc kubenswrapper[4795]: I0320 17:49:45.997543 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.009488 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"]
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.168416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.168626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gt76\" (UniqueName: \"kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.168914 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.253004 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"
Mar 20 17:49:46 crc kubenswrapper[4795]: E0320 17:49:46.253425 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.271514 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.271914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.272722 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gt76\" (UniqueName: \"kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.276316 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.276991 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.293041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gt76\" (UniqueName: \"kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5rj55\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.327483 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:49:46 crc kubenswrapper[4795]: I0320 17:49:46.918550 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"]
Mar 20 17:49:46 crc kubenswrapper[4795]: W0320 17:49:46.925359 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20b330a0_830c_419e_81fe_a36dd1a32cc2.slice/crio-ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110 WatchSource:0}: Error finding container ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110: Status 404 returned error can't find the container with id ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110
Mar 20 17:49:47 crc kubenswrapper[4795]: I0320 17:49:47.906271 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" event={"ID":"20b330a0-830c-419e-81fe-a36dd1a32cc2","Type":"ContainerStarted","Data":"902ad09d05a718a4dc819827f582612e00fd8e9ea310fb5d70c495ae4bd899ea"}
Mar 20 17:49:47 crc kubenswrapper[4795]: I0320 17:49:47.906756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" event={"ID":"20b330a0-830c-419e-81fe-a36dd1a32cc2","Type":"ContainerStarted","Data":"ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110"}
Mar 20 17:49:47 crc kubenswrapper[4795]: I0320 17:49:47.939020 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" podStartSLOduration=2.555337413 podStartE2EDuration="2.938995435s" podCreationTimestamp="2026-03-20 17:49:45 +0000 UTC" firstStartedPulling="2026-03-20 17:49:46.928642372 +0000 UTC m=+1930.386673923" lastFinishedPulling="2026-03-20 17:49:47.312300414 +0000 UTC m=+1930.770331945" observedRunningTime="2026-03-20 17:49:47.931063438 +0000 UTC m=+1931.389095019" watchObservedRunningTime="2026-03-20 17:49:47.938995435 +0000 UTC m=+1931.397027016"
Mar 20 17:49:51 crc kubenswrapper[4795]: I0320 17:49:51.070257 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jfhz8"]
Mar 20 17:49:51 crc kubenswrapper[4795]: I0320 17:49:51.076718 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jfhz8"]
Mar 20 17:49:51 crc kubenswrapper[4795]: I0320 17:49:51.268875 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a8b32b-fab3-401f-b667-592c8840bd97" path="/var/lib/kubelet/pods/02a8b32b-fab3-401f-b667-592c8840bd97/volumes"
Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.437648 4795 scope.go:117] "RemoveContainer" containerID="181a6c95c401e76e8326dba0d2e07f193da50d6dbe0b9151509c36ea5ad10c3e"
Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.500586 4795 scope.go:117] "RemoveContainer" containerID="1a5ed57a211fe9b0c1882f91516bfc8da29711316e391c1c87ed18df2cb6cc36"
Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.542832 4795 scope.go:117] "RemoveContainer" containerID="f4b5007e4a1309d08572b5c31e2719d5a9d1e8abc1f29797304920c21729de14"
Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.595282 4795 scope.go:117] "RemoveContainer" containerID="21aac3ceb6dfd938908085675b810b1f95e9aaa0d7afc715430454788951ca0a"
Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.640026 4795 scope.go:117] "RemoveContainer" containerID="d5e2a993de0ac2a73d513cdc5305eaa4c6be7243356c29c7534462e994b17675"
Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.670086 4795 scope.go:117] "RemoveContainer" containerID="b63d10a82b890eac4b6bd4726e08b48ae844a0390be4307858d97c75a41d914f"
Mar 20 17:49:53 crc kubenswrapper[4795]: I0320 17:49:53.709645 4795 scope.go:117] "RemoveContainer" containerID="43233e70951461edfae55a0a1e96e29418696077a56a7e8b60e307ca9af5a951"
Mar 20 17:49:59 crc kubenswrapper[4795]: I0320 17:49:59.255487 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"
Mar 20 17:49:59 crc kubenswrapper[4795]: E0320 17:49:59.256835 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.171833 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567150-glzlx"]
Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.188480 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-glzlx"]
Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.188609 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-glzlx"
Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.192655 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.192929 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.192665 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.351453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4bzc\" (UniqueName: \"kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc\") pod \"auto-csr-approver-29567150-glzlx\" (UID: \"5e628e06-ee48-4969-824e-fba400b67d3a\") " pod="openshift-infra/auto-csr-approver-29567150-glzlx"
Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.454179 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4bzc\" (UniqueName: \"kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc\") pod \"auto-csr-approver-29567150-glzlx\" (UID: \"5e628e06-ee48-4969-824e-fba400b67d3a\") " pod="openshift-infra/auto-csr-approver-29567150-glzlx"
Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.478659 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4bzc\" (UniqueName: \"kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc\") pod \"auto-csr-approver-29567150-glzlx\" (UID: \"5e628e06-ee48-4969-824e-fba400b67d3a\") " pod="openshift-infra/auto-csr-approver-29567150-glzlx"
Mar 20 17:50:00 crc kubenswrapper[4795]: I0320 17:50:00.533405 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-glzlx"
Mar 20 17:50:01 crc kubenswrapper[4795]: W0320 17:50:01.048162 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e628e06_ee48_4969_824e_fba400b67d3a.slice/crio-8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2 WatchSource:0}: Error finding container 8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2: Status 404 returned error can't find the container with id 8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2
Mar 20 17:50:01 crc kubenswrapper[4795]: I0320 17:50:01.063024 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-glzlx"]
Mar 20 17:50:02 crc kubenswrapper[4795]: I0320 17:50:02.062940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-glzlx" event={"ID":"5e628e06-ee48-4969-824e-fba400b67d3a","Type":"ContainerStarted","Data":"8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2"}
Mar 20 17:50:03 crc kubenswrapper[4795]: I0320 17:50:03.071573 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e628e06-ee48-4969-824e-fba400b67d3a" containerID="b0e8f1ce702c9e1cfb11740285e904a1e8d1f711ef3e97850efbb6236da59523" exitCode=0
Mar 20 17:50:03 crc kubenswrapper[4795]: I0320 17:50:03.073464 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-glzlx" event={"ID":"5e628e06-ee48-4969-824e-fba400b67d3a","Type":"ContainerDied","Data":"b0e8f1ce702c9e1cfb11740285e904a1e8d1f711ef3e97850efbb6236da59523"}
Mar 20 17:50:04 crc kubenswrapper[4795]: I0320 17:50:04.492905 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-glzlx"
Mar 20 17:50:04 crc kubenswrapper[4795]: I0320 17:50:04.668758 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4bzc\" (UniqueName: \"kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc\") pod \"5e628e06-ee48-4969-824e-fba400b67d3a\" (UID: \"5e628e06-ee48-4969-824e-fba400b67d3a\") "
Mar 20 17:50:04 crc kubenswrapper[4795]: I0320 17:50:04.676962 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc" (OuterVolumeSpecName: "kube-api-access-t4bzc") pod "5e628e06-ee48-4969-824e-fba400b67d3a" (UID: "5e628e06-ee48-4969-824e-fba400b67d3a"). InnerVolumeSpecName "kube-api-access-t4bzc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:50:04 crc kubenswrapper[4795]: I0320 17:50:04.771325 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4bzc\" (UniqueName: \"kubernetes.io/projected/5e628e06-ee48-4969-824e-fba400b67d3a-kube-api-access-t4bzc\") on node \"crc\" DevicePath \"\""
Mar 20 17:50:05 crc kubenswrapper[4795]: I0320 17:50:05.102037 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-glzlx" event={"ID":"5e628e06-ee48-4969-824e-fba400b67d3a","Type":"ContainerDied","Data":"8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2"}
Mar 20 17:50:05 crc kubenswrapper[4795]: I0320 17:50:05.102109 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8278317db8ace0b2ea0dda5bbccab7fbb1c97bc5e3956f8ce5c06eb7901382f2"
Mar 20 17:50:05 crc kubenswrapper[4795]: I0320 17:50:05.102785 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-glzlx"
Mar 20 17:50:05 crc kubenswrapper[4795]: I0320 17:50:05.563983 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-khl6t"]
Mar 20 17:50:05 crc kubenswrapper[4795]: I0320 17:50:05.570495 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-khl6t"]
Mar 20 17:50:07 crc kubenswrapper[4795]: I0320 17:50:07.281543 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6396cd8-bc19-4f24-ae36-12356bfa8133" path="/var/lib/kubelet/pods/a6396cd8-bc19-4f24-ae36-12356bfa8133/volumes"
Mar 20 17:50:12 crc kubenswrapper[4795]: I0320 17:50:12.048990 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-kmgk9"]
Mar 20 17:50:12 crc kubenswrapper[4795]: I0320 17:50:12.062764 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-kmgk9"]
Mar 20 17:50:12 crc kubenswrapper[4795]: I0320 17:50:12.252512 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"
Mar 20 17:50:12 crc kubenswrapper[4795]: E0320 17:50:12.252805 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:50:13 crc kubenswrapper[4795]: I0320 17:50:13.292195 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18794d5c-e43a-44dc-9510-763a31275104" path="/var/lib/kubelet/pods/18794d5c-e43a-44dc-9510-763a31275104/volumes"
Mar 20 17:50:14 crc kubenswrapper[4795]: I0320 17:50:14.045378 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcxtg"]
Mar 20 17:50:14 crc kubenswrapper[4795]: I0320 17:50:14.057392 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcxtg"]
Mar 20 17:50:15 crc kubenswrapper[4795]: I0320 17:50:15.273466 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f61db3a-a7de-495d-8305-b9e2910415e2" path="/var/lib/kubelet/pods/4f61db3a-a7de-495d-8305-b9e2910415e2/volumes"
Mar 20 17:50:24 crc kubenswrapper[4795]: I0320 17:50:24.252847 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"
Mar 20 17:50:24 crc kubenswrapper[4795]: E0320 17:50:24.253723 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:50:36 crc kubenswrapper[4795]: I0320 17:50:36.478675 4795 generic.go:334] "Generic (PLEG): container finished" podID="20b330a0-830c-419e-81fe-a36dd1a32cc2" containerID="902ad09d05a718a4dc819827f582612e00fd8e9ea310fb5d70c495ae4bd899ea" exitCode=0
Mar 20 17:50:36 crc kubenswrapper[4795]: I0320 17:50:36.478720 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" event={"ID":"20b330a0-830c-419e-81fe-a36dd1a32cc2","Type":"ContainerDied","Data":"902ad09d05a718a4dc819827f582612e00fd8e9ea310fb5d70c495ae4bd899ea"}
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.032983 4795 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.182461 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam\") pod \"20b330a0-830c-419e-81fe-a36dd1a32cc2\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") "
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.182616 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory\") pod \"20b330a0-830c-419e-81fe-a36dd1a32cc2\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") "
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.182793 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gt76\" (UniqueName: \"kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76\") pod \"20b330a0-830c-419e-81fe-a36dd1a32cc2\" (UID: \"20b330a0-830c-419e-81fe-a36dd1a32cc2\") "
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.191791 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76" (OuterVolumeSpecName: "kube-api-access-8gt76") pod "20b330a0-830c-419e-81fe-a36dd1a32cc2" (UID: "20b330a0-830c-419e-81fe-a36dd1a32cc2"). InnerVolumeSpecName "kube-api-access-8gt76". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.248145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory" (OuterVolumeSpecName: "inventory") pod "20b330a0-830c-419e-81fe-a36dd1a32cc2" (UID: "20b330a0-830c-419e-81fe-a36dd1a32cc2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.248843 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "20b330a0-830c-419e-81fe-a36dd1a32cc2" (UID: "20b330a0-830c-419e-81fe-a36dd1a32cc2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.285456 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.285494 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gt76\" (UniqueName: \"kubernetes.io/projected/20b330a0-830c-419e-81fe-a36dd1a32cc2-kube-api-access-8gt76\") on node \"crc\" DevicePath \"\""
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.285508 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20b330a0-830c-419e-81fe-a36dd1a32cc2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.506236 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55" event={"ID":"20b330a0-830c-419e-81fe-a36dd1a32cc2","Type":"ContainerDied","Data":"ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110"}
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.506281 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee8e83601ab43e68e510cafb3b4ff82d935f3576672f9eef4242277b96bbc110"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.506298 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5rj55"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.708092 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"]
Mar 20 17:50:38 crc kubenswrapper[4795]: E0320 17:50:38.708565 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e628e06-ee48-4969-824e-fba400b67d3a" containerName="oc"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.708582 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e628e06-ee48-4969-824e-fba400b67d3a" containerName="oc"
Mar 20 17:50:38 crc kubenswrapper[4795]: E0320 17:50:38.708594 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b330a0-830c-419e-81fe-a36dd1a32cc2" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.708601 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b330a0-830c-419e-81fe-a36dd1a32cc2" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.708880 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b330a0-830c-419e-81fe-a36dd1a32cc2" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.708896 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e628e06-ee48-4969-824e-fba400b67d3a" containerName="oc"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.709447 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.711213 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.711425 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.711737 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.713550 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.719893 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"]
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.897763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.897832 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"
Mar 20 17:50:38 crc kubenswrapper[4795]: I0320 17:50:38.897997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwftz\" (UniqueName: \"kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"
Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.000416 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"
Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.000488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"
Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.000586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwftz\" (UniqueName: \"kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"
Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.004167 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"
Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.011480 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"
Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.021802 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwftz\" (UniqueName: \"kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"
Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.025812 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.254464 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:50:39 crc kubenswrapper[4795]: E0320 17:50:39.255574 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:50:39 crc kubenswrapper[4795]: I0320 17:50:39.647009 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm"] Mar 20 17:50:39 crc kubenswrapper[4795]: W0320 17:50:39.654134 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d666090_1065_4b2d_9ac6_b84776b53d0a.slice/crio-87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39 WatchSource:0}: Error finding container 87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39: Status 404 returned error can't find the container with id 87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39 Mar 20 17:50:40 crc kubenswrapper[4795]: I0320 17:50:40.530324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" event={"ID":"3d666090-1065-4b2d-9ac6-b84776b53d0a","Type":"ContainerStarted","Data":"894408f7a6f3e593d4f36352df00c09efefbfa26446049a6cd6ff9f1cbfa90a4"} Mar 20 17:50:40 crc kubenswrapper[4795]: I0320 17:50:40.530648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" event={"ID":"3d666090-1065-4b2d-9ac6-b84776b53d0a","Type":"ContainerStarted","Data":"87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39"} Mar 20 17:50:40 crc kubenswrapper[4795]: I0320 17:50:40.557076 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" podStartSLOduration=2.034467904 podStartE2EDuration="2.55704474s" podCreationTimestamp="2026-03-20 17:50:38 +0000 UTC" firstStartedPulling="2026-03-20 17:50:39.656740474 +0000 UTC m=+1983.114772025" lastFinishedPulling="2026-03-20 17:50:40.17931728 +0000 UTC m=+1983.637348861" observedRunningTime="2026-03-20 17:50:40.552637433 +0000 UTC m=+1984.010669004" watchObservedRunningTime="2026-03-20 17:50:40.55704474 +0000 UTC m=+1984.015076321" Mar 20 17:50:53 crc kubenswrapper[4795]: I0320 17:50:53.252816 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48" Mar 20 17:50:53 crc kubenswrapper[4795]: I0320 17:50:53.673369 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5"} Mar 20 17:50:53 crc kubenswrapper[4795]: I0320 17:50:53.884309 4795 scope.go:117] "RemoveContainer" containerID="2bb981684096c6d7989fb2cb73e5f71d3f241740ae3f49c070189810ce7e7bb1" Mar 20 17:50:53 crc kubenswrapper[4795]: I0320 17:50:53.950909 4795 scope.go:117] "RemoveContainer" containerID="de6cb1e775438df313c59586663c35fb681b66b389159fc9df68bc69d850ac1c" Mar 20 17:50:53 crc kubenswrapper[4795]: I0320 17:50:53.985492 4795 scope.go:117] "RemoveContainer" containerID="29087c37b0e22594df358a498bb26205f2050bb1e4a607372b3a2ba3b4df8dd7" Mar 20 17:50:59 crc kubenswrapper[4795]: I0320 17:50:59.067113 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-67csj"] Mar 20 17:50:59 crc kubenswrapper[4795]: I0320 17:50:59.082342 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-67csj"] Mar 20 17:50:59 crc kubenswrapper[4795]: I0320 17:50:59.271781 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc8602c-1f19-4825-b3e5-32d643f12430" path="/var/lib/kubelet/pods/fbc8602c-1f19-4825-b3e5-32d643f12430/volumes" Mar 20 17:51:33 crc kubenswrapper[4795]: I0320 17:51:33.117490 4795 generic.go:334] "Generic (PLEG): container finished" podID="3d666090-1065-4b2d-9ac6-b84776b53d0a" containerID="894408f7a6f3e593d4f36352df00c09efefbfa26446049a6cd6ff9f1cbfa90a4" exitCode=0 Mar 20 17:51:33 crc kubenswrapper[4795]: I0320 17:51:33.117594 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" event={"ID":"3d666090-1065-4b2d-9ac6-b84776b53d0a","Type":"ContainerDied","Data":"894408f7a6f3e593d4f36352df00c09efefbfa26446049a6cd6ff9f1cbfa90a4"} Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.597789 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.760245 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory\") pod \"3d666090-1065-4b2d-9ac6-b84776b53d0a\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.760639 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwftz\" (UniqueName: \"kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz\") pod \"3d666090-1065-4b2d-9ac6-b84776b53d0a\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.760777 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam\") pod \"3d666090-1065-4b2d-9ac6-b84776b53d0a\" (UID: \"3d666090-1065-4b2d-9ac6-b84776b53d0a\") " Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.768038 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz" (OuterVolumeSpecName: "kube-api-access-mwftz") pod "3d666090-1065-4b2d-9ac6-b84776b53d0a" (UID: "3d666090-1065-4b2d-9ac6-b84776b53d0a"). InnerVolumeSpecName "kube-api-access-mwftz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.813487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3d666090-1065-4b2d-9ac6-b84776b53d0a" (UID: "3d666090-1065-4b2d-9ac6-b84776b53d0a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.816095 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory" (OuterVolumeSpecName: "inventory") pod "3d666090-1065-4b2d-9ac6-b84776b53d0a" (UID: "3d666090-1065-4b2d-9ac6-b84776b53d0a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.864521 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwftz\" (UniqueName: \"kubernetes.io/projected/3d666090-1065-4b2d-9ac6-b84776b53d0a-kube-api-access-mwftz\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.864598 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:34 crc kubenswrapper[4795]: I0320 17:51:34.864627 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d666090-1065-4b2d-9ac6-b84776b53d0a-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.148642 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" 
event={"ID":"3d666090-1065-4b2d-9ac6-b84776b53d0a","Type":"ContainerDied","Data":"87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39"} Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.148747 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87be9490f3043ab76061904609d5c52b56fd0e2c7bb349615847b8323282db39" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.148842 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.434369 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j6rls"] Mar 20 17:51:35 crc kubenswrapper[4795]: E0320 17:51:35.435215 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d666090-1065-4b2d-9ac6-b84776b53d0a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.435258 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d666090-1065-4b2d-9ac6-b84776b53d0a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.435631 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d666090-1065-4b2d-9ac6-b84776b53d0a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.436792 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.440670 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.441784 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.443871 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.444102 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.453659 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j6rls"] Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.582734 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjzxb\" (UniqueName: \"kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.582849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.583016 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.684870 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.685134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjzxb\" (UniqueName: \"kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.685232 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.693606 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: 
I0320 17:51:35.699452 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.715371 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjzxb\" (UniqueName: \"kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb\") pod \"ssh-known-hosts-edpm-deployment-j6rls\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:35 crc kubenswrapper[4795]: I0320 17:51:35.766233 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:36 crc kubenswrapper[4795]: I0320 17:51:36.354432 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-j6rls"] Mar 20 17:51:37 crc kubenswrapper[4795]: I0320 17:51:37.172530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" event={"ID":"80cf5a83-936d-4789-a7bc-b91cdb0e564d","Type":"ContainerStarted","Data":"a299166917e0a539fa06dd999e0055403dbb7b59d130347d0ae95dd5287f90cf"} Mar 20 17:51:37 crc kubenswrapper[4795]: I0320 17:51:37.172876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" event={"ID":"80cf5a83-936d-4789-a7bc-b91cdb0e564d","Type":"ContainerStarted","Data":"02964fff73e3665f58b601e9cc6da3da341c2ab0c31c859cfed6fac2de7b5310"} Mar 20 17:51:37 crc kubenswrapper[4795]: I0320 17:51:37.212930 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" podStartSLOduration=1.75360969 
podStartE2EDuration="2.212903578s" podCreationTimestamp="2026-03-20 17:51:35 +0000 UTC" firstStartedPulling="2026-03-20 17:51:36.365134188 +0000 UTC m=+2039.823165729" lastFinishedPulling="2026-03-20 17:51:36.824428036 +0000 UTC m=+2040.282459617" observedRunningTime="2026-03-20 17:51:37.197423027 +0000 UTC m=+2040.655454578" watchObservedRunningTime="2026-03-20 17:51:37.212903578 +0000 UTC m=+2040.670935149" Mar 20 17:51:44 crc kubenswrapper[4795]: I0320 17:51:44.249316 4795 generic.go:334] "Generic (PLEG): container finished" podID="80cf5a83-936d-4789-a7bc-b91cdb0e564d" containerID="a299166917e0a539fa06dd999e0055403dbb7b59d130347d0ae95dd5287f90cf" exitCode=0 Mar 20 17:51:44 crc kubenswrapper[4795]: I0320 17:51:44.249486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" event={"ID":"80cf5a83-936d-4789-a7bc-b91cdb0e564d","Type":"ContainerDied","Data":"a299166917e0a539fa06dd999e0055403dbb7b59d130347d0ae95dd5287f90cf"} Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.698524 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.745526 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0\") pod \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.745576 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam\") pod \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.745753 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjzxb\" (UniqueName: \"kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb\") pod \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\" (UID: \"80cf5a83-936d-4789-a7bc-b91cdb0e564d\") " Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.754971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb" (OuterVolumeSpecName: "kube-api-access-sjzxb") pod "80cf5a83-936d-4789-a7bc-b91cdb0e564d" (UID: "80cf5a83-936d-4789-a7bc-b91cdb0e564d"). InnerVolumeSpecName "kube-api-access-sjzxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.783132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "80cf5a83-936d-4789-a7bc-b91cdb0e564d" (UID: "80cf5a83-936d-4789-a7bc-b91cdb0e564d"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.799365 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "80cf5a83-936d-4789-a7bc-b91cdb0e564d" (UID: "80cf5a83-936d-4789-a7bc-b91cdb0e564d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.846767 4795 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.847009 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/80cf5a83-936d-4789-a7bc-b91cdb0e564d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:45 crc kubenswrapper[4795]: I0320 17:51:45.847100 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjzxb\" (UniqueName: \"kubernetes.io/projected/80cf5a83-936d-4789-a7bc-b91cdb0e564d-kube-api-access-sjzxb\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.277339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" event={"ID":"80cf5a83-936d-4789-a7bc-b91cdb0e564d","Type":"ContainerDied","Data":"02964fff73e3665f58b601e9cc6da3da341c2ab0c31c859cfed6fac2de7b5310"} Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.277579 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02964fff73e3665f58b601e9cc6da3da341c2ab0c31c859cfed6fac2de7b5310" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.277483 
4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-j6rls" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.430104 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6"] Mar 20 17:51:46 crc kubenswrapper[4795]: E0320 17:51:46.430515 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80cf5a83-936d-4789-a7bc-b91cdb0e564d" containerName="ssh-known-hosts-edpm-deployment" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.430533 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="80cf5a83-936d-4789-a7bc-b91cdb0e564d" containerName="ssh-known-hosts-edpm-deployment" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.430798 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="80cf5a83-936d-4789-a7bc-b91cdb0e564d" containerName="ssh-known-hosts-edpm-deployment" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.431507 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.446854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6"] Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.469067 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.469140 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.469081 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.469451 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.563397 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.564230 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvsgk\" (UniqueName: \"kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.564559 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.667271 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvsgk\" (UniqueName: \"kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.667614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.668494 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.673040 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: 
\"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.673424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.688342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvsgk\" (UniqueName: \"kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-prfq6\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:46 crc kubenswrapper[4795]: I0320 17:51:46.783495 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:47 crc kubenswrapper[4795]: I0320 17:51:47.354505 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6"] Mar 20 17:51:47 crc kubenswrapper[4795]: W0320 17:51:47.358195 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cdb4943_60a1_41cc_aead_1702a4c1f68a.slice/crio-1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0 WatchSource:0}: Error finding container 1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0: Status 404 returned error can't find the container with id 1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0 Mar 20 17:51:48 crc kubenswrapper[4795]: I0320 17:51:48.309569 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" event={"ID":"9cdb4943-60a1-41cc-aead-1702a4c1f68a","Type":"ContainerStarted","Data":"6ddabdaea64d113fb6013d8724623c1c666cb6ccb8ea39aaa1327bb1e88278fe"} Mar 20 17:51:48 crc kubenswrapper[4795]: I0320 17:51:48.311219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" event={"ID":"9cdb4943-60a1-41cc-aead-1702a4c1f68a","Type":"ContainerStarted","Data":"1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0"} Mar 20 17:51:48 crc kubenswrapper[4795]: I0320 17:51:48.341326 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" podStartSLOduration=1.8062972259999999 podStartE2EDuration="2.34130195s" podCreationTimestamp="2026-03-20 17:51:46 +0000 UTC" firstStartedPulling="2026-03-20 17:51:47.362266895 +0000 UTC m=+2050.820298466" lastFinishedPulling="2026-03-20 17:51:47.897271609 +0000 UTC m=+2051.355303190" 
observedRunningTime="2026-03-20 17:51:48.337983547 +0000 UTC m=+2051.796015158" watchObservedRunningTime="2026-03-20 17:51:48.34130195 +0000 UTC m=+2051.799333531" Mar 20 17:51:54 crc kubenswrapper[4795]: I0320 17:51:54.126237 4795 scope.go:117] "RemoveContainer" containerID="c3dbd02db17863581582e40f291eb346e5dca8aa3c7d277d71e53142232286eb" Mar 20 17:51:56 crc kubenswrapper[4795]: I0320 17:51:56.398745 4795 generic.go:334] "Generic (PLEG): container finished" podID="9cdb4943-60a1-41cc-aead-1702a4c1f68a" containerID="6ddabdaea64d113fb6013d8724623c1c666cb6ccb8ea39aaa1327bb1e88278fe" exitCode=0 Mar 20 17:51:56 crc kubenswrapper[4795]: I0320 17:51:56.398830 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" event={"ID":"9cdb4943-60a1-41cc-aead-1702a4c1f68a","Type":"ContainerDied","Data":"6ddabdaea64d113fb6013d8724623c1c666cb6ccb8ea39aaa1327bb1e88278fe"} Mar 20 17:51:57 crc kubenswrapper[4795]: I0320 17:51:57.879335 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.024754 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvsgk\" (UniqueName: \"kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk\") pod \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.025113 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory\") pod \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.025152 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam\") pod \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\" (UID: \"9cdb4943-60a1-41cc-aead-1702a4c1f68a\") " Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.033914 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk" (OuterVolumeSpecName: "kube-api-access-fvsgk") pod "9cdb4943-60a1-41cc-aead-1702a4c1f68a" (UID: "9cdb4943-60a1-41cc-aead-1702a4c1f68a"). InnerVolumeSpecName "kube-api-access-fvsgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.074161 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory" (OuterVolumeSpecName: "inventory") pod "9cdb4943-60a1-41cc-aead-1702a4c1f68a" (UID: "9cdb4943-60a1-41cc-aead-1702a4c1f68a"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.077570 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9cdb4943-60a1-41cc-aead-1702a4c1f68a" (UID: "9cdb4943-60a1-41cc-aead-1702a4c1f68a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.128945 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.129021 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9cdb4943-60a1-41cc-aead-1702a4c1f68a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.129041 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvsgk\" (UniqueName: \"kubernetes.io/projected/9cdb4943-60a1-41cc-aead-1702a4c1f68a-kube-api-access-fvsgk\") on node \"crc\" DevicePath \"\"" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.430780 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" event={"ID":"9cdb4943-60a1-41cc-aead-1702a4c1f68a","Type":"ContainerDied","Data":"1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0"} Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.430819 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fdb0e8e927ac13e2e95c01c859fc96b55cc4101b09f9ae90907828ce57888e0" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 
17:51:58.430876 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-prfq6" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.528851 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88"] Mar 20 17:51:58 crc kubenswrapper[4795]: E0320 17:51:58.529436 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cdb4943-60a1-41cc-aead-1702a4c1f68a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.529463 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdb4943-60a1-41cc-aead-1702a4c1f68a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.529759 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cdb4943-60a1-41cc-aead-1702a4c1f68a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.530618 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.533372 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.533449 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.533572 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.534338 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.542102 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bdvp\" (UniqueName: \"kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.542290 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.542497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.552386 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88"] Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.643180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bdvp\" (UniqueName: \"kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.643258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.643334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.647823 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.653514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.666196 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bdvp\" (UniqueName: \"kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:58 crc kubenswrapper[4795]: I0320 17:51:58.849420 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:51:59 crc kubenswrapper[4795]: I0320 17:51:59.392349 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88"] Mar 20 17:51:59 crc kubenswrapper[4795]: I0320 17:51:59.440003 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" event={"ID":"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4","Type":"ContainerStarted","Data":"5364cbab99f63192734fe793bfb234d47c253a170235e5b57235660ea2366376"} Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.142656 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567152-ksnjt"] Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.144432 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.171853 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.173724 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.172209 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.176760 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xwmd\" (UniqueName: \"kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd\") pod \"auto-csr-approver-29567152-ksnjt\" (UID: \"d8abf4de-a372-47df-b14c-490f1e084a56\") " pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 
17:52:00.186453 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-ksnjt"] Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.279473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xwmd\" (UniqueName: \"kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd\") pod \"auto-csr-approver-29567152-ksnjt\" (UID: \"d8abf4de-a372-47df-b14c-490f1e084a56\") " pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.301102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xwmd\" (UniqueName: \"kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd\") pod \"auto-csr-approver-29567152-ksnjt\" (UID: \"d8abf4de-a372-47df-b14c-490f1e084a56\") " pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.451855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" event={"ID":"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4","Type":"ContainerStarted","Data":"01d757bb54c99abdbeddf07efb6aaf2657cdd1489743a59d6dc864872b8e779b"} Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 17:52:00.483280 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" podStartSLOduration=2.05557857 podStartE2EDuration="2.483243723s" podCreationTimestamp="2026-03-20 17:51:58 +0000 UTC" firstStartedPulling="2026-03-20 17:51:59.398901069 +0000 UTC m=+2062.856932620" lastFinishedPulling="2026-03-20 17:51:59.826566182 +0000 UTC m=+2063.284597773" observedRunningTime="2026-03-20 17:52:00.479758645 +0000 UTC m=+2063.937790206" watchObservedRunningTime="2026-03-20 17:52:00.483243723 +0000 UTC m=+2063.941275334" Mar 20 17:52:00 crc kubenswrapper[4795]: I0320 
17:52:00.497782 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:01 crc kubenswrapper[4795]: W0320 17:52:01.013857 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8abf4de_a372_47df_b14c_490f1e084a56.slice/crio-5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5 WatchSource:0}: Error finding container 5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5: Status 404 returned error can't find the container with id 5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5 Mar 20 17:52:01 crc kubenswrapper[4795]: I0320 17:52:01.017616 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-ksnjt"] Mar 20 17:52:01 crc kubenswrapper[4795]: I0320 17:52:01.461979 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" event={"ID":"d8abf4de-a372-47df-b14c-490f1e084a56","Type":"ContainerStarted","Data":"5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5"} Mar 20 17:52:02 crc kubenswrapper[4795]: I0320 17:52:02.472960 4795 generic.go:334] "Generic (PLEG): container finished" podID="d8abf4de-a372-47df-b14c-490f1e084a56" containerID="fa7c9e74af14d50a1c364d101636ba64da237edd40eebd00160c638dba974672" exitCode=0 Mar 20 17:52:02 crc kubenswrapper[4795]: I0320 17:52:02.473036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" event={"ID":"d8abf4de-a372-47df-b14c-490f1e084a56","Type":"ContainerDied","Data":"fa7c9e74af14d50a1c364d101636ba64da237edd40eebd00160c638dba974672"} Mar 20 17:52:03 crc kubenswrapper[4795]: I0320 17:52:03.857099 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:03 crc kubenswrapper[4795]: I0320 17:52:03.960053 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xwmd\" (UniqueName: \"kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd\") pod \"d8abf4de-a372-47df-b14c-490f1e084a56\" (UID: \"d8abf4de-a372-47df-b14c-490f1e084a56\") " Mar 20 17:52:03 crc kubenswrapper[4795]: I0320 17:52:03.966266 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd" (OuterVolumeSpecName: "kube-api-access-5xwmd") pod "d8abf4de-a372-47df-b14c-490f1e084a56" (UID: "d8abf4de-a372-47df-b14c-490f1e084a56"). InnerVolumeSpecName "kube-api-access-5xwmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.062363 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xwmd\" (UniqueName: \"kubernetes.io/projected/d8abf4de-a372-47df-b14c-490f1e084a56-kube-api-access-5xwmd\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.499995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" event={"ID":"d8abf4de-a372-47df-b14c-490f1e084a56","Type":"ContainerDied","Data":"5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5"} Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.500045 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ac572b8ef022030929ebe0f5308b1ed22522264a24e5df91b3bf86ce790e3a5" Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.500110 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-ksnjt" Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.956453 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-xzfkq"] Mar 20 17:52:04 crc kubenswrapper[4795]: I0320 17:52:04.970048 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-xzfkq"] Mar 20 17:52:05 crc kubenswrapper[4795]: I0320 17:52:05.268493 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c" path="/var/lib/kubelet/pods/6ce1ddf5-f6e1-40ab-926d-4cf03d502e9c/volumes" Mar 20 17:52:10 crc kubenswrapper[4795]: I0320 17:52:10.564076 4795 generic.go:334] "Generic (PLEG): container finished" podID="1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" containerID="01d757bb54c99abdbeddf07efb6aaf2657cdd1489743a59d6dc864872b8e779b" exitCode=0 Mar 20 17:52:10 crc kubenswrapper[4795]: I0320 17:52:10.564186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" event={"ID":"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4","Type":"ContainerDied","Data":"01d757bb54c99abdbeddf07efb6aaf2657cdd1489743a59d6dc864872b8e779b"} Mar 20 17:52:11 crc kubenswrapper[4795]: I0320 17:52:11.935195 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.058654 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory\") pod \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.058751 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bdvp\" (UniqueName: \"kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp\") pod \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.058972 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam\") pod \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\" (UID: \"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4\") " Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.064371 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp" (OuterVolumeSpecName: "kube-api-access-4bdvp") pod "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" (UID: "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4"). InnerVolumeSpecName "kube-api-access-4bdvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.091052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" (UID: "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.107531 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory" (OuterVolumeSpecName: "inventory") pod "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" (UID: "1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.161110 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.161144 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.161158 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bdvp\" (UniqueName: \"kubernetes.io/projected/1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4-kube-api-access-4bdvp\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.587435 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" 
event={"ID":"1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4","Type":"ContainerDied","Data":"5364cbab99f63192734fe793bfb234d47c253a170235e5b57235660ea2366376"} Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.587495 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5364cbab99f63192734fe793bfb234d47c253a170235e5b57235660ea2366376" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.587529 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.707004 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5"] Mar 20 17:52:12 crc kubenswrapper[4795]: E0320 17:52:12.707389 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8abf4de-a372-47df-b14c-490f1e084a56" containerName="oc" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.707412 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8abf4de-a372-47df-b14c-490f1e084a56" containerName="oc" Mar 20 17:52:12 crc kubenswrapper[4795]: E0320 17:52:12.707461 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.707471 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.707668 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8abf4de-a372-47df-b14c-490f1e084a56" containerName="oc" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.707730 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4" 
containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.708427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.720484 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.721033 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.722872 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.723548 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.723664 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.723673 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.723916 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.726890 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.753992 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5"] Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 
17:52:12.776483 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776556 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776592 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776675 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776779 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776809 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brpkg\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.776886 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: 
I0320 17:52:12.776936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.777075 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.777131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.777203 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.777238 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.777272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877844 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877869 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877925 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.877985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878016 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brpkg\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.878161 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.887484 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.887475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.887675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.888075 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.888388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.888950 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.889029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.889325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.890053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.891331 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.892477 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.893304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.895582 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:12 crc kubenswrapper[4795]: I0320 17:52:12.898007 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brpkg\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:13 crc kubenswrapper[4795]: I0320 17:52:13.039636 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:13 crc kubenswrapper[4795]: I0320 17:52:13.610188 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5"] Mar 20 17:52:13 crc kubenswrapper[4795]: W0320 17:52:13.612624 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ab0ae7d_87ee_4e3f_a963_d126c5ddab8c.slice/crio-e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6 WatchSource:0}: Error finding container e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6: Status 404 returned error can't find the container with id e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6 Mar 20 17:52:14 crc kubenswrapper[4795]: I0320 17:52:14.619440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" event={"ID":"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c","Type":"ContainerStarted","Data":"fbddf6801bd4755280989348b7233b7faafa5ea394bfbc59e7cb23626aa16c9a"} Mar 20 17:52:14 crc kubenswrapper[4795]: I0320 17:52:14.620118 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" event={"ID":"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c","Type":"ContainerStarted","Data":"e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6"} Mar 20 17:52:14 crc kubenswrapper[4795]: I0320 17:52:14.666727 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" podStartSLOduration=2.264384852 podStartE2EDuration="2.666708826s" podCreationTimestamp="2026-03-20 17:52:12 +0000 UTC" firstStartedPulling="2026-03-20 17:52:13.615265576 +0000 UTC m=+2077.073297127" lastFinishedPulling="2026-03-20 17:52:14.01758952 +0000 UTC m=+2077.475621101" observedRunningTime="2026-03-20 17:52:14.650622266 +0000 UTC m=+2078.108653827" watchObservedRunningTime="2026-03-20 17:52:14.666708826 +0000 UTC m=+2078.124740387" Mar 20 17:52:52 crc kubenswrapper[4795]: I0320 17:52:52.028544 4795 generic.go:334] "Generic (PLEG): container finished" podID="0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" containerID="fbddf6801bd4755280989348b7233b7faafa5ea394bfbc59e7cb23626aa16c9a" exitCode=0 Mar 20 17:52:52 crc kubenswrapper[4795]: I0320 17:52:52.028640 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" event={"ID":"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c","Type":"ContainerDied","Data":"fbddf6801bd4755280989348b7233b7faafa5ea394bfbc59e7cb23626aa16c9a"} Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.538299 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724577 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724649 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724841 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.724901 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725050 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brpkg\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725151 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc 
kubenswrapper[4795]: I0320 17:52:53.725216 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725247 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725292 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.725327 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle\") pod \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\" (UID: \"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c\") " Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.731982 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.733122 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.733181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg" (OuterVolumeSpecName: "kube-api-access-brpkg") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "kube-api-access-brpkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.735615 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.735624 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.736459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.737004 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.737646 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.737882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.740242 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.742961 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.747880 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.786063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.786547 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory" (OuterVolumeSpecName: "inventory") pod "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" (UID: "0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828297 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828340 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828357 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828371 4795 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828384 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828396 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828414 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828426 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828437 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brpkg\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-kube-api-access-brpkg\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828449 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828460 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828471 4795 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828483 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:53 crc kubenswrapper[4795]: I0320 17:52:53.828495 4795 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.051340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5" event={"ID":"0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c","Type":"ContainerDied","Data":"e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6"}
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.051399 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5df1eb5622378821a9d1751f25ad91d0875e58facbfefc8a6fd11f7dbf014e6"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.051503 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.223792 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"]
Mar 20 17:52:54 crc kubenswrapper[4795]: E0320 17:52:54.224480 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.224570 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.224821 4795 scope.go:117] "RemoveContainer" containerID="19365e3a16d7780a49439b00d5a850dc06a2d65e28a412bb9b05e779d9d4ec51"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.225049 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.226147 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.231780 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.232009 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.232284 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.232493 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.232637 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.238690 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"]
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.347681 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.348439 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.348512 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.348543 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.348653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqp2f\" (UniqueName: \"kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.450089 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.450147 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.450196 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.450228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.450882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqp2f\" (UniqueName: \"kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.451005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.454254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.455004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.455207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.466159 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqp2f\" (UniqueName: \"kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9jw45\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:54 crc kubenswrapper[4795]: I0320 17:52:54.592828 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"
Mar 20 17:52:55 crc kubenswrapper[4795]: I0320 17:52:55.624661 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45"]
Mar 20 17:52:56 crc kubenswrapper[4795]: I0320 17:52:56.170649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" event={"ID":"6c737290-0616-475b-a839-cca387d8d90d","Type":"ContainerStarted","Data":"885fd0d01e15e8e73814661bd573c22dd2c0eba50edad204e7a8f73bdb2ca832"}
Mar 20 17:52:57 crc kubenswrapper[4795]: I0320 17:52:57.181909 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" event={"ID":"6c737290-0616-475b-a839-cca387d8d90d","Type":"ContainerStarted","Data":"ce3704459df378ee719ee4944060e7d6335ef9231a60476865a774b316275a14"}
Mar 20 17:52:57 crc kubenswrapper[4795]: I0320 17:52:57.225009 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" podStartSLOduration=2.621942797 podStartE2EDuration="3.2249921s" podCreationTimestamp="2026-03-20 17:52:54 +0000 UTC" firstStartedPulling="2026-03-20 17:52:55.631551018 +0000 UTC m=+2119.089582569" lastFinishedPulling="2026-03-20 17:52:56.234600291 +0000 UTC m=+2119.692631872" observedRunningTime="2026-03-20 17:52:57.21728055 +0000 UTC m=+2120.675312091" watchObservedRunningTime="2026-03-20 17:52:57.2249921 +0000 UTC m=+2120.683023641"
Mar 20 17:53:11 crc kubenswrapper[4795]: I0320 17:53:11.299789 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:53:11 crc kubenswrapper[4795]: I0320 17:53:11.300478 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:53:21 crc kubenswrapper[4795]: I0320 17:53:21.949344 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"]
Mar 20 17:53:21 crc kubenswrapper[4795]: I0320 17:53:21.954801 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:21 crc kubenswrapper[4795]: I0320 17:53:21.966634 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"]
Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.109643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.109830 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.109913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w8p9\" (UniqueName: \"kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.211560 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.211645 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w8p9\" (UniqueName: \"kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.211824 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.212355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.212954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.237754 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w8p9\" (UniqueName: \"kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9\") pod \"redhat-operators-5z2tx\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") " pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.287391 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:22 crc kubenswrapper[4795]: I0320 17:53:22.764754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"]
Mar 20 17:53:23 crc kubenswrapper[4795]: I0320 17:53:23.497039 4795 generic.go:334] "Generic (PLEG): container finished" podID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerID="f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872" exitCode=0
Mar 20 17:53:23 crc kubenswrapper[4795]: I0320 17:53:23.497083 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerDied","Data":"f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872"}
Mar 20 17:53:23 crc kubenswrapper[4795]: I0320 17:53:23.497301 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerStarted","Data":"98600df691122c949eb7ec07e25241678890358ec78c766404af0bcce9f83085"}
Mar 20 17:53:24 crc kubenswrapper[4795]: I0320 17:53:24.512334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerStarted","Data":"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd"}
Mar 20 17:53:27 crc kubenswrapper[4795]: I0320 17:53:27.542879 4795 generic.go:334] "Generic (PLEG): container finished" podID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerID="4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd" exitCode=0
Mar 20 17:53:27 crc kubenswrapper[4795]: I0320 17:53:27.542950 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerDied","Data":"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd"}
Mar 20 17:53:29 crc kubenswrapper[4795]: I0320 17:53:29.559970 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerStarted","Data":"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f"}
Mar 20 17:53:29 crc kubenswrapper[4795]: I0320 17:53:29.580662 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5z2tx" podStartSLOduration=3.6511068399999997 podStartE2EDuration="8.58064466s" podCreationTimestamp="2026-03-20 17:53:21 +0000 UTC" firstStartedPulling="2026-03-20 17:53:23.499112019 +0000 UTC m=+2146.957143560" lastFinishedPulling="2026-03-20 17:53:28.428649839 +0000 UTC m=+2151.886681380" observedRunningTime="2026-03-20 17:53:29.57875121 +0000 UTC m=+2153.036782751" watchObservedRunningTime="2026-03-20 17:53:29.58064466 +0000 UTC m=+2153.038676201"
Mar 20 17:53:32 crc kubenswrapper[4795]: I0320 17:53:32.287719 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:32 crc kubenswrapper[4795]: I0320 17:53:32.287975 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:33 crc kubenswrapper[4795]: I0320 17:53:33.386626 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5z2tx" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="registry-server" probeResult="failure" output=<
Mar 20 17:53:33 crc kubenswrapper[4795]: 	timeout: failed to connect service ":50051" within 1s
Mar 20 17:53:33 crc kubenswrapper[4795]:  >
Mar 20 17:53:41 crc kubenswrapper[4795]: I0320 17:53:41.300546 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:53:41 crc kubenswrapper[4795]: I0320 17:53:41.301181 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:53:42 crc kubenswrapper[4795]: I0320 17:53:42.334152 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:42 crc kubenswrapper[4795]: I0320 17:53:42.393676 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:42 crc kubenswrapper[4795]: I0320 17:53:42.572633 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"]
Mar 20 17:53:43 crc kubenswrapper[4795]: I0320 17:53:43.686175 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5z2tx" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="registry-server" containerID="cri-o://0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f" gracePeriod=2
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.105022 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.245909 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content\") pod \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") "
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.246825 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities\") pod \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") "
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.247162 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w8p9\" (UniqueName: \"kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9\") pod \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\" (UID: \"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8\") "
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.247843 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities" (OuterVolumeSpecName: "utilities") pod "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" (UID: "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.252020 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.253948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9" (OuterVolumeSpecName: "kube-api-access-4w8p9") pod "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" (UID: "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8"). InnerVolumeSpecName "kube-api-access-4w8p9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.357508 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w8p9\" (UniqueName: \"kubernetes.io/projected/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-kube-api-access-4w8p9\") on node \"crc\" DevicePath \"\""
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.437316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" (UID: "9547205b-b5cf-4d81-bbaa-ffa7b3de10b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.458679 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.697062 4795 generic.go:334] "Generic (PLEG): container finished" podID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerID="0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f" exitCode=0
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.697109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerDied","Data":"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f"}
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.697125 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5z2tx"
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.697143 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z2tx" event={"ID":"9547205b-b5cf-4d81-bbaa-ffa7b3de10b8","Type":"ContainerDied","Data":"98600df691122c949eb7ec07e25241678890358ec78c766404af0bcce9f83085"}
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.697164 4795 scope.go:117] "RemoveContainer" containerID="0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f"
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.715931 4795 scope.go:117] "RemoveContainer" containerID="4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd"
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.729759 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"]
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.737154 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5z2tx"]
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.751361 4795 scope.go:117] "RemoveContainer" containerID="f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872"
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.791777 4795 scope.go:117] "RemoveContainer" containerID="0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f"
Mar 20 17:53:44 crc kubenswrapper[4795]: E0320 17:53:44.792322 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f\": container with ID starting with 0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f not found: ID does not exist" containerID="0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f"
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.792360 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f"} err="failed to get container status \"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f\": rpc error: code = NotFound desc = could not find container \"0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f\": container with ID starting with 0481826998a41bb9853700dbd5b2d5fbc65ac9dfe93f63d75963f232fbc8198f not found: ID does not exist"
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.792390 4795 scope.go:117] "RemoveContainer" containerID="4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd"
Mar 20 17:53:44 crc kubenswrapper[4795]: E0320 17:53:44.792846 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd\": container with ID starting with 4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd not found: ID does not exist" containerID="4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd"
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.792866 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd"} err="failed to get container status \"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd\": rpc error: code = NotFound desc = could not find container \"4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd\": container with ID starting with 4e60b7be14555a7cffe6947fe1fae273d0c19279e9d03302018e796c5e123ddd not found: ID does not exist"
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.792879 4795 scope.go:117] "RemoveContainer" containerID="f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872"
Mar 20 17:53:44 crc kubenswrapper[4795]: E0320 17:53:44.793268 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872\": container with ID starting with f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872 not found: ID does not exist" containerID="f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872"
Mar 20 17:53:44 crc kubenswrapper[4795]: I0320 17:53:44.793301 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872"} err="failed to get container status \"f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872\": rpc error: code = NotFound desc = could not find container \"f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872\": container with ID starting with f1945d55294a386841e6d459cd148d23e6da1f566119d10e5e6f4969b6db7872 not found: ID does not exist"
Mar 20 17:53:45 crc kubenswrapper[4795]: I0320 17:53:45.267638 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" path="/var/lib/kubelet/pods/9547205b-b5cf-4d81-bbaa-ffa7b3de10b8/volumes"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.443896 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t7p7j"]
Mar 20 17:53:52 crc kubenswrapper[4795]: E0320 17:53:52.444793 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="extract-content"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.444805 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="extract-content"
Mar 20 17:53:52 crc kubenswrapper[4795]: E0320 17:53:52.444816 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="registry-server"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.444822 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="registry-server"
Mar 20 17:53:52 crc kubenswrapper[4795]: E0320 17:53:52.444839 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="extract-utilities"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.444846 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="extract-utilities"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.445028 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9547205b-b5cf-4d81-bbaa-ffa7b3de10b8" containerName="registry-server"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.446347 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t7p7j"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.463629 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t7p7j"]
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.557812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjx7l\" (UniqueName: \"kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.557969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.557992 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.659741 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.659789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.659844 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjx7l\" (UniqueName: \"kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j"
Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.660267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities\") pod \"community-operators-t7p7j\" (UID:
\"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.660342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.679191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjx7l\" (UniqueName: \"kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l\") pod \"community-operators-t7p7j\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") " pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:52 crc kubenswrapper[4795]: I0320 17:53:52.779008 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:53:53 crc kubenswrapper[4795]: I0320 17:53:53.380343 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t7p7j"] Mar 20 17:53:53 crc kubenswrapper[4795]: I0320 17:53:53.784376 4795 generic.go:334] "Generic (PLEG): container finished" podID="c0450211-15da-4926-9f2c-f1169ac44b02" containerID="f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9" exitCode=0 Mar 20 17:53:53 crc kubenswrapper[4795]: I0320 17:53:53.784475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerDied","Data":"f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9"} Mar 20 17:53:53 crc kubenswrapper[4795]: I0320 17:53:53.784666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerStarted","Data":"bed2375c0e8c4495026b6591fb4eeb0ebb046e32c6a24c250a89855bed8b766d"} Mar 20 17:53:55 crc kubenswrapper[4795]: I0320 17:53:55.959075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerStarted","Data":"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e"} Mar 20 17:53:56 crc kubenswrapper[4795]: I0320 17:53:56.988278 4795 generic.go:334] "Generic (PLEG): container finished" podID="c0450211-15da-4926-9f2c-f1169ac44b02" containerID="c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e" exitCode=0 Mar 20 17:53:56 crc kubenswrapper[4795]: I0320 17:53:56.988330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" 
event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerDied","Data":"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e"} Mar 20 17:53:58 crc kubenswrapper[4795]: I0320 17:53:58.002890 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerStarted","Data":"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377"} Mar 20 17:53:58 crc kubenswrapper[4795]: I0320 17:53:58.036196 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t7p7j" podStartSLOduration=2.354875688 podStartE2EDuration="6.036177902s" podCreationTimestamp="2026-03-20 17:53:52 +0000 UTC" firstStartedPulling="2026-03-20 17:53:53.787098184 +0000 UTC m=+2177.245129735" lastFinishedPulling="2026-03-20 17:53:57.468400408 +0000 UTC m=+2180.926431949" observedRunningTime="2026-03-20 17:53:58.027648257 +0000 UTC m=+2181.485679838" watchObservedRunningTime="2026-03-20 17:53:58.036177902 +0000 UTC m=+2181.494209453" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.151972 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567154-4z2rq"] Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.155667 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.162342 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.162373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.164648 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.200050 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-4z2rq"] Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.258379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g89wj\" (UniqueName: \"kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj\") pod \"auto-csr-approver-29567154-4z2rq\" (UID: \"61821949-5c88-4f4c-adab-b93269540a03\") " pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.360575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g89wj\" (UniqueName: \"kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj\") pod \"auto-csr-approver-29567154-4z2rq\" (UID: \"61821949-5c88-4f4c-adab-b93269540a03\") " pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.382372 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g89wj\" (UniqueName: \"kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj\") pod \"auto-csr-approver-29567154-4z2rq\" (UID: \"61821949-5c88-4f4c-adab-b93269540a03\") " 
pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.490563 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:00 crc kubenswrapper[4795]: I0320 17:54:00.982115 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-4z2rq"] Mar 20 17:54:00 crc kubenswrapper[4795]: W0320 17:54:00.986921 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61821949_5c88_4f4c_adab_b93269540a03.slice/crio-84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb WatchSource:0}: Error finding container 84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb: Status 404 returned error can't find the container with id 84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb Mar 20 17:54:01 crc kubenswrapper[4795]: I0320 17:54:01.029674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" event={"ID":"61821949-5c88-4f4c-adab-b93269540a03","Type":"ContainerStarted","Data":"84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb"} Mar 20 17:54:02 crc kubenswrapper[4795]: I0320 17:54:02.792792 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:54:02 crc kubenswrapper[4795]: I0320 17:54:02.793761 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:54:02 crc kubenswrapper[4795]: I0320 17:54:02.851671 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.051676 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="6c737290-0616-475b-a839-cca387d8d90d" containerID="ce3704459df378ee719ee4944060e7d6335ef9231a60476865a774b316275a14" exitCode=0 Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.051757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" event={"ID":"6c737290-0616-475b-a839-cca387d8d90d","Type":"ContainerDied","Data":"ce3704459df378ee719ee4944060e7d6335ef9231a60476865a774b316275a14"} Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.054467 4795 generic.go:334] "Generic (PLEG): container finished" podID="61821949-5c88-4f4c-adab-b93269540a03" containerID="6f68cab9e191fff6af7e246da5293fa0fd1c14c356566586bb75900bd179fcf6" exitCode=0 Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.054538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" event={"ID":"61821949-5c88-4f4c-adab-b93269540a03","Type":"ContainerDied","Data":"6f68cab9e191fff6af7e246da5293fa0fd1c14c356566586bb75900bd179fcf6"} Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.138576 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t7p7j" Mar 20 17:54:03 crc kubenswrapper[4795]: I0320 17:54:03.189550 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t7p7j"] Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.440309 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.539904 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g89wj\" (UniqueName: \"kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj\") pod \"61821949-5c88-4f4c-adab-b93269540a03\" (UID: \"61821949-5c88-4f4c-adab-b93269540a03\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.545849 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj" (OuterVolumeSpecName: "kube-api-access-g89wj") pod "61821949-5c88-4f4c-adab-b93269540a03" (UID: "61821949-5c88-4f4c-adab-b93269540a03"). InnerVolumeSpecName "kube-api-access-g89wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.597843 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641005 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0\") pod \"6c737290-0616-475b-a839-cca387d8d90d\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641328 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle\") pod \"6c737290-0616-475b-a839-cca387d8d90d\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam\") pod \"6c737290-0616-475b-a839-cca387d8d90d\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641411 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqp2f\" (UniqueName: \"kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f\") pod \"6c737290-0616-475b-a839-cca387d8d90d\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory\") pod \"6c737290-0616-475b-a839-cca387d8d90d\" (UID: \"6c737290-0616-475b-a839-cca387d8d90d\") " Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.641920 4795 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g89wj\" (UniqueName: \"kubernetes.io/projected/61821949-5c88-4f4c-adab-b93269540a03-kube-api-access-g89wj\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.648866 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f" (OuterVolumeSpecName: "kube-api-access-qqp2f") pod "6c737290-0616-475b-a839-cca387d8d90d" (UID: "6c737290-0616-475b-a839-cca387d8d90d"). InnerVolumeSpecName "kube-api-access-qqp2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.648886 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6c737290-0616-475b-a839-cca387d8d90d" (UID: "6c737290-0616-475b-a839-cca387d8d90d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.667660 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "6c737290-0616-475b-a839-cca387d8d90d" (UID: "6c737290-0616-475b-a839-cca387d8d90d"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.668629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory" (OuterVolumeSpecName: "inventory") pod "6c737290-0616-475b-a839-cca387d8d90d" (UID: "6c737290-0616-475b-a839-cca387d8d90d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.689227 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6c737290-0616-475b-a839-cca387d8d90d" (UID: "6c737290-0616-475b-a839-cca387d8d90d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.743221 4795 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c737290-0616-475b-a839-cca387d8d90d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.743273 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.743292 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.743312 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqp2f\" (UniqueName: \"kubernetes.io/projected/6c737290-0616-475b-a839-cca387d8d90d-kube-api-access-qqp2f\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4795]: I0320 17:54:04.743331 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c737290-0616-475b-a839-cca387d8d90d-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.087370 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.087365 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9jw45" event={"ID":"6c737290-0616-475b-a839-cca387d8d90d","Type":"ContainerDied","Data":"885fd0d01e15e8e73814661bd573c22dd2c0eba50edad204e7a8f73bdb2ca832"} Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.087588 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="885fd0d01e15e8e73814661bd573c22dd2c0eba50edad204e7a8f73bdb2ca832" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.090757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" event={"ID":"61821949-5c88-4f4c-adab-b93269540a03","Type":"ContainerDied","Data":"84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb"} Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.090838 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84297b6f90e060c6cc27dee62c3208dd416eef5e5bbd690c5df7f26c90a61ecb" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.090779 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-4z2rq" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.090958 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t7p7j" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="registry-server" containerID="cri-o://e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377" gracePeriod=2 Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.207732 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"] Mar 20 17:54:05 crc kubenswrapper[4795]: E0320 17:54:05.208147 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61821949-5c88-4f4c-adab-b93269540a03" containerName="oc" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.208163 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="61821949-5c88-4f4c-adab-b93269540a03" containerName="oc" Mar 20 17:54:05 crc kubenswrapper[4795]: E0320 17:54:05.208202 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c737290-0616-475b-a839-cca387d8d90d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.208211 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c737290-0616-475b-a839-cca387d8d90d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.208428 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="61821949-5c88-4f4c-adab-b93269540a03" containerName="oc" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.208459 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c737290-0616-475b-a839-cca387d8d90d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.209257 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.213798 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.215994 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.216281 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.216600 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.216824 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.217019 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.224754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"] Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255500 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255549 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rrq2\" (UniqueName: \"kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255576 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255680 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.255759 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358760 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358795 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358867 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rrq2\" (UniqueName: \"kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.358885 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.368139 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.371857 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.372779 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.377372 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.377462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.378531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rrq2\" (UniqueName: \"kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.515797 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-nqw6d"]
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.523787 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-nqw6d"]
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.541252 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.608291 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t7p7j"
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.670203 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content\") pod \"c0450211-15da-4926-9f2c-f1169ac44b02\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") "
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.670248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities\") pod \"c0450211-15da-4926-9f2c-f1169ac44b02\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") "
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.670326 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjx7l\" (UniqueName: \"kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l\") pod \"c0450211-15da-4926-9f2c-f1169ac44b02\" (UID: \"c0450211-15da-4926-9f2c-f1169ac44b02\") "
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.677480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities" (OuterVolumeSpecName: "utilities") pod "c0450211-15da-4926-9f2c-f1169ac44b02" (UID: "c0450211-15da-4926-9f2c-f1169ac44b02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.698819 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l" (OuterVolumeSpecName: "kube-api-access-mjx7l") pod "c0450211-15da-4926-9f2c-f1169ac44b02" (UID: "c0450211-15da-4926-9f2c-f1169ac44b02"). InnerVolumeSpecName "kube-api-access-mjx7l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.773102 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:54:05 crc kubenswrapper[4795]: I0320 17:54:05.773451 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjx7l\" (UniqueName: \"kubernetes.io/projected/c0450211-15da-4926-9f2c-f1169ac44b02-kube-api-access-mjx7l\") on node \"crc\" DevicePath \"\""
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.114892 4795 generic.go:334] "Generic (PLEG): container finished" podID="c0450211-15da-4926-9f2c-f1169ac44b02" containerID="e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377" exitCode=0
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.114933 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerDied","Data":"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377"}
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.114960 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t7p7j" event={"ID":"c0450211-15da-4926-9f2c-f1169ac44b02","Type":"ContainerDied","Data":"bed2375c0e8c4495026b6591fb4eeb0ebb046e32c6a24c250a89855bed8b766d"}
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.114977 4795 scope.go:117] "RemoveContainer" containerID="e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377"
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.115052 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t7p7j"
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.141832 4795 scope.go:117] "RemoveContainer" containerID="c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e"
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.144741 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0450211-15da-4926-9f2c-f1169ac44b02" (UID: "c0450211-15da-4926-9f2c-f1169ac44b02"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.168894 4795 scope.go:117] "RemoveContainer" containerID="f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9"
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.186298 4795 scope.go:117] "RemoveContainer" containerID="e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377"
Mar 20 17:54:06 crc kubenswrapper[4795]: E0320 17:54:06.186720 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377\": container with ID starting with e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377 not found: ID does not exist" containerID="e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377"
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.186758 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377"} err="failed to get container status \"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377\": rpc error: code = NotFound desc = could not find container \"e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377\": container with ID starting with e4504d246c0647fbd6c674dd1182104797051da3c32e8cfcbf5833441d66d377 not found: ID does not exist"
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.186787 4795 scope.go:117] "RemoveContainer" containerID="c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e"
Mar 20 17:54:06 crc kubenswrapper[4795]: E0320 17:54:06.187257 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e\": container with ID starting with c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e not found: ID does not exist" containerID="c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e"
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.187282 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e"} err="failed to get container status \"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e\": rpc error: code = NotFound desc = could not find container \"c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e\": container with ID starting with c4379007c7dac8ba6f1951dcd0332091e1bf57da5e576a96a6a0d1b688f6590e not found: ID does not exist"
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.187301 4795 scope.go:117] "RemoveContainer" containerID="f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9"
Mar 20 17:54:06 crc kubenswrapper[4795]: E0320 17:54:06.187620 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9\": container with ID starting with f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9 not found: ID does not exist" containerID="f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9"
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.187643 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9"} err="failed to get container status \"f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9\": rpc error: code = NotFound desc = could not find container \"f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9\": container with ID starting with f885fd33453393f22694c8d5f7ade5ca6fc8e4b0b339287a2a11e94ee459f4f9 not found: ID does not exist"
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.209002 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0450211-15da-4926-9f2c-f1169ac44b02-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.240681 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"]
Mar 20 17:54:06 crc kubenswrapper[4795]: W0320 17:54:06.242606 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode29f4857_ff0d_4806_ba09_74448200e8e2.slice/crio-cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d WatchSource:0}: Error finding container cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d: Status 404 returned error can't find the container with id cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.450007 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t7p7j"]
Mar 20 17:54:06 crc kubenswrapper[4795]: I0320 17:54:06.458771 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t7p7j"]
Mar 20 17:54:07 crc kubenswrapper[4795]: I0320 17:54:07.130264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" event={"ID":"e29f4857-ff0d-4806-ba09-74448200e8e2","Type":"ContainerStarted","Data":"cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d"}
Mar 20 17:54:07 crc kubenswrapper[4795]: I0320 17:54:07.270778 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5541d8b2-57fb-4162-8ee0-ac6630a5d91c" path="/var/lib/kubelet/pods/5541d8b2-57fb-4162-8ee0-ac6630a5d91c/volumes"
Mar 20 17:54:07 crc kubenswrapper[4795]: I0320 17:54:07.272010 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" path="/var/lib/kubelet/pods/c0450211-15da-4926-9f2c-f1169ac44b02/volumes"
Mar 20 17:54:08 crc kubenswrapper[4795]: I0320 17:54:08.145405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" event={"ID":"e29f4857-ff0d-4806-ba09-74448200e8e2","Type":"ContainerStarted","Data":"577e1b360b8d0a6d1805b049f234defcfede1bfedd6ea85d12e6a79faf221601"}
Mar 20 17:54:08 crc kubenswrapper[4795]: I0320 17:54:08.185468 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" podStartSLOduration=2.422390864 podStartE2EDuration="3.185441706s" podCreationTimestamp="2026-03-20 17:54:05 +0000 UTC" firstStartedPulling="2026-03-20 17:54:06.246275453 +0000 UTC m=+2189.704306994" lastFinishedPulling="2026-03-20 17:54:07.009326255 +0000 UTC m=+2190.467357836" observedRunningTime="2026-03-20 17:54:08.17337884 +0000 UTC m=+2191.631410461" watchObservedRunningTime="2026-03-20 17:54:08.185441706 +0000 UTC m=+2191.643473277"
Mar 20 17:54:11 crc kubenswrapper[4795]: I0320 17:54:11.300440 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:54:11 crc kubenswrapper[4795]: I0320 17:54:11.300894 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:54:11 crc kubenswrapper[4795]: I0320 17:54:11.300970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt"
Mar 20 17:54:11 crc kubenswrapper[4795]: I0320 17:54:11.302100 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 17:54:11 crc kubenswrapper[4795]: I0320 17:54:11.302188 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5" gracePeriod=600
Mar 20 17:54:12 crc kubenswrapper[4795]: I0320 17:54:12.189794 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5" exitCode=0
Mar 20 17:54:12 crc kubenswrapper[4795]: I0320 17:54:12.189861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5"}
Mar 20 17:54:12 crc kubenswrapper[4795]: I0320 17:54:12.190504 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"}
Mar 20 17:54:12 crc kubenswrapper[4795]: I0320 17:54:12.190528 4795 scope.go:117] "RemoveContainer" containerID="59a63ed3e2204e95fd2e3d521e00a6b3861d7df6add1303c3e8a1035433dcd48"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.690892 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-965kf"]
Mar 20 17:54:37 crc kubenswrapper[4795]: E0320 17:54:37.691813 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="registry-server"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.691827 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="registry-server"
Mar 20 17:54:37 crc kubenswrapper[4795]: E0320 17:54:37.691846 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="extract-utilities"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.691855 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="extract-utilities"
Mar 20 17:54:37 crc kubenswrapper[4795]: E0320 17:54:37.691882 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="extract-content"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.691891 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="extract-content"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.692107 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0450211-15da-4926-9f2c-f1169ac44b02" containerName="registry-server"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.698266 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.713279 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-965kf"]
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.773975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-catalog-content\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.774616 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-utilities\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.774845 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wb6\" (UniqueName: \"kubernetes.io/projected/58169b5e-ad5e-4928-8511-1677518e9c01-kube-api-access-j2wb6\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.876445 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wb6\" (UniqueName: \"kubernetes.io/projected/58169b5e-ad5e-4928-8511-1677518e9c01-kube-api-access-j2wb6\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.876514 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-catalog-content\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.876602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-utilities\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.877238 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-catalog-content\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.877290 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58169b5e-ad5e-4928-8511-1677518e9c01-utilities\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:37 crc kubenswrapper[4795]: I0320 17:54:37.908962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wb6\" (UniqueName: \"kubernetes.io/projected/58169b5e-ad5e-4928-8511-1677518e9c01-kube-api-access-j2wb6\") pod \"certified-operators-965kf\" (UID: \"58169b5e-ad5e-4928-8511-1677518e9c01\") " pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:38 crc kubenswrapper[4795]: I0320 17:54:38.060861 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:38 crc kubenswrapper[4795]: I0320 17:54:38.560098 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-965kf"]
Mar 20 17:54:39 crc kubenswrapper[4795]: I0320 17:54:39.490496 4795 generic.go:334] "Generic (PLEG): container finished" podID="58169b5e-ad5e-4928-8511-1677518e9c01" containerID="4a2f3baa702fca105563aa68055d3a58583a358513f90dfc80bff4554e4f2096" exitCode=0
Mar 20 17:54:39 crc kubenswrapper[4795]: I0320 17:54:39.490735 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-965kf" event={"ID":"58169b5e-ad5e-4928-8511-1677518e9c01","Type":"ContainerDied","Data":"4a2f3baa702fca105563aa68055d3a58583a358513f90dfc80bff4554e4f2096"}
Mar 20 17:54:39 crc kubenswrapper[4795]: I0320 17:54:39.490923 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-965kf" event={"ID":"58169b5e-ad5e-4928-8511-1677518e9c01","Type":"ContainerStarted","Data":"7a0b6420471b9f828cd1128c605846105b818dc7f2339e52c0d06e0ac67b44b1"}
Mar 20 17:54:39 crc kubenswrapper[4795]: I0320 17:54:39.493096 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 17:54:45 crc kubenswrapper[4795]: I0320 17:54:45.546277 4795 generic.go:334] "Generic (PLEG): container finished" podID="58169b5e-ad5e-4928-8511-1677518e9c01" containerID="7c278ea905649fc7c9070ed011d683ccaad85e86e73b928da4c656a69cedf2d6" exitCode=0
Mar 20 17:54:45 crc kubenswrapper[4795]: I0320 17:54:45.546472 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-965kf" event={"ID":"58169b5e-ad5e-4928-8511-1677518e9c01","Type":"ContainerDied","Data":"7c278ea905649fc7c9070ed011d683ccaad85e86e73b928da4c656a69cedf2d6"}
Mar 20 17:54:46 crc kubenswrapper[4795]: I0320 17:54:46.557007 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-965kf" event={"ID":"58169b5e-ad5e-4928-8511-1677518e9c01","Type":"ContainerStarted","Data":"997556791f530fd72643777860c4e6ca0bf8ee815bffe9d233a1b1f1d9372b6c"}
Mar 20 17:54:46 crc kubenswrapper[4795]: I0320 17:54:46.579964 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-965kf" podStartSLOduration=3.009004363 podStartE2EDuration="9.579941447s" podCreationTimestamp="2026-03-20 17:54:37 +0000 UTC" firstStartedPulling="2026-03-20 17:54:39.492815854 +0000 UTC m=+2222.950847405" lastFinishedPulling="2026-03-20 17:54:46.063752948 +0000 UTC m=+2229.521784489" observedRunningTime="2026-03-20 17:54:46.579032848 +0000 UTC m=+2230.037064389" watchObservedRunningTime="2026-03-20 17:54:46.579941447 +0000 UTC m=+2230.037973008"
Mar 20 17:54:48 crc kubenswrapper[4795]: I0320 17:54:48.061938 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:48 crc kubenswrapper[4795]: I0320 17:54:48.062340 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:48 crc kubenswrapper[4795]: I0320 17:54:48.118804 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:54 crc kubenswrapper[4795]: I0320 17:54:54.367449 4795 scope.go:117] "RemoveContainer" containerID="29ff925c9ace295b0e664d8ded17085588e84c04311df585ee77cb9a00150d0d"
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.155076 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-965kf"
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.262217 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-965kf"]
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.322177 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94mw5"]
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.322470 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-94mw5" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="registry-server" containerID="cri-o://4e8b0e1259002a3662a0cddeb6fd2fae0a2ae00aa800f638b73b2262d55bedd2" gracePeriod=2
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.732172 4795 generic.go:334] "Generic (PLEG): container finished" podID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerID="4e8b0e1259002a3662a0cddeb6fd2fae0a2ae00aa800f638b73b2262d55bedd2" exitCode=0
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.732272 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mw5" event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerDied","Data":"4e8b0e1259002a3662a0cddeb6fd2fae0a2ae00aa800f638b73b2262d55bedd2"}
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.732506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94mw5" event={"ID":"a79f11dc-5b5e-4929-9a6f-281ade73c24a","Type":"ContainerDied","Data":"5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd"}
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.732519 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fce8bc30342d1350c917a4ec214edeb37188f99757b74fd3902c54cd2292edd"
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.782016 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94mw5"
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.836945 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th9xz\" (UniqueName: \"kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz\") pod \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") "
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.836998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content\") pod \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") "
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.842301 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz" (OuterVolumeSpecName: "kube-api-access-th9xz") pod "a79f11dc-5b5e-4929-9a6f-281ade73c24a" (UID: "a79f11dc-5b5e-4929-9a6f-281ade73c24a"). InnerVolumeSpecName "kube-api-access-th9xz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.855059 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities\") pod \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\" (UID: \"a79f11dc-5b5e-4929-9a6f-281ade73c24a\") "
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.855386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities" (OuterVolumeSpecName: "utilities") pod "a79f11dc-5b5e-4929-9a6f-281ade73c24a" (UID: "a79f11dc-5b5e-4929-9a6f-281ade73c24a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.856011 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th9xz\" (UniqueName: \"kubernetes.io/projected/a79f11dc-5b5e-4929-9a6f-281ade73c24a-kube-api-access-th9xz\") on node \"crc\" DevicePath \"\""
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.856035 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.888473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a79f11dc-5b5e-4929-9a6f-281ade73c24a" (UID: "a79f11dc-5b5e-4929-9a6f-281ade73c24a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:54:58 crc kubenswrapper[4795]: I0320 17:54:58.957460 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79f11dc-5b5e-4929-9a6f-281ade73c24a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:54:59 crc kubenswrapper[4795]: I0320 17:54:59.738921 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94mw5"
Mar 20 17:54:59 crc kubenswrapper[4795]: I0320 17:54:59.761722 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94mw5"]
Mar 20 17:54:59 crc kubenswrapper[4795]: I0320 17:54:59.772796 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-94mw5"]
Mar 20 17:55:00 crc kubenswrapper[4795]: I0320 17:55:00.747804 4795 generic.go:334] "Generic (PLEG): container finished" podID="e29f4857-ff0d-4806-ba09-74448200e8e2" containerID="577e1b360b8d0a6d1805b049f234defcfede1bfedd6ea85d12e6a79faf221601" exitCode=0
Mar 20 17:55:00 crc kubenswrapper[4795]: I0320 17:55:00.747992 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" event={"ID":"e29f4857-ff0d-4806-ba09-74448200e8e2","Type":"ContainerDied","Data":"577e1b360b8d0a6d1805b049f234defcfede1bfedd6ea85d12e6a79faf221601"}
Mar 20 17:55:01 crc kubenswrapper[4795]: I0320 17:55:01.269430 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" path="/var/lib/kubelet/pods/a79f11dc-5b5e-4929-9a6f-281ade73c24a/volumes"
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.265022 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7"
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.323029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") "
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.323088 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rrq2\" (UniqueName: \"kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") "
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.323127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") "
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.323209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") "
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.323285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") "
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.323335 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e29f4857-ff0d-4806-ba09-74448200e8e2\" (UID: \"e29f4857-ff0d-4806-ba09-74448200e8e2\") "
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.328984 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.329908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2" (OuterVolumeSpecName: "kube-api-access-4rrq2") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "kube-api-access-4rrq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.357017 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.359324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.362862 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory" (OuterVolumeSpecName: "inventory") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.374291 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e29f4857-ff0d-4806-ba09-74448200e8e2" (UID: "e29f4857-ff0d-4806-ba09-74448200e8e2"). InnerVolumeSpecName "nova-metadata-neutron-config-0".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426184 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426230 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426244 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426260 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rrq2\" (UniqueName: \"kubernetes.io/projected/e29f4857-ff0d-4806-ba09-74448200e8e2-kube-api-access-4rrq2\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426272 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.426285 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e29f4857-ff0d-4806-ba09-74448200e8e2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.773293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" event={"ID":"e29f4857-ff0d-4806-ba09-74448200e8e2","Type":"ContainerDied","Data":"cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d"} Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.773344 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cab1ba7e3d0b7f779cd029adf4bb5f1a966c1a35930a30adf2c6d6649ec8985d" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.773982 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.886560 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q"] Mar 20 17:55:02 crc kubenswrapper[4795]: E0320 17:55:02.887290 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29f4857-ff0d-4806-ba09-74448200e8e2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887322 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29f4857-ff0d-4806-ba09-74448200e8e2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 17:55:02 crc kubenswrapper[4795]: E0320 17:55:02.887352 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="extract-utilities" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887361 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="extract-utilities" Mar 20 17:55:02 crc kubenswrapper[4795]: E0320 17:55:02.887389 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="registry-server" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887397 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="registry-server" Mar 20 17:55:02 crc kubenswrapper[4795]: E0320 17:55:02.887414 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="extract-content" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887422 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="extract-content" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887635 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79f11dc-5b5e-4929-9a6f-281ade73c24a" containerName="registry-server" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.887654 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29f4857-ff0d-4806-ba09-74448200e8e2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.889016 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.892034 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.892216 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.892782 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.892967 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.893050 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.913603 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q"] Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.939878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.940185 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dshp7\" (UniqueName: \"kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.940355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.940551 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:02 crc kubenswrapper[4795]: I0320 17:55:02.940762 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.043382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.043918 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.044173 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dshp7\" (UniqueName: \"kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.044448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.044777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.048875 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: 
\"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.049838 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.050111 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.050426 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.065762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dshp7\" (UniqueName: \"kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp77q\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.207861 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:55:03 crc kubenswrapper[4795]: I0320 17:55:03.798479 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q"] Mar 20 17:55:04 crc kubenswrapper[4795]: I0320 17:55:04.789645 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" event={"ID":"b6da9d2a-e18f-4994-b8f3-6b1eb969564b","Type":"ContainerStarted","Data":"916d45486bc7c9429b47d1621e1445553bb690d12dcaa2aceb7cbd80e6648c0c"} Mar 20 17:55:04 crc kubenswrapper[4795]: I0320 17:55:04.790286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" event={"ID":"b6da9d2a-e18f-4994-b8f3-6b1eb969564b","Type":"ContainerStarted","Data":"ce67ae3c6d3b823d115ffd8cc57bd8b2b930ee03e39a23bfb62f0f80486fdc98"} Mar 20 17:55:04 crc kubenswrapper[4795]: I0320 17:55:04.809321 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" podStartSLOduration=2.3938161989999998 podStartE2EDuration="2.809302062s" podCreationTimestamp="2026-03-20 17:55:02 +0000 UTC" firstStartedPulling="2026-03-20 17:55:03.80564339 +0000 UTC m=+2247.263674931" lastFinishedPulling="2026-03-20 17:55:04.221129253 +0000 UTC m=+2247.679160794" observedRunningTime="2026-03-20 17:55:04.804893875 +0000 UTC m=+2248.262925436" watchObservedRunningTime="2026-03-20 17:55:04.809302062 +0000 UTC m=+2248.267333613" Mar 20 17:55:54 crc kubenswrapper[4795]: I0320 17:55:54.459337 4795 scope.go:117] "RemoveContainer" containerID="ec6a69189563a780b942ae970e8e1801846953cabcf1239c190354a1203053b4" Mar 20 17:55:54 crc kubenswrapper[4795]: I0320 17:55:54.502746 4795 scope.go:117] "RemoveContainer" containerID="4e8b0e1259002a3662a0cddeb6fd2fae0a2ae00aa800f638b73b2262d55bedd2" Mar 20 17:55:54 crc kubenswrapper[4795]: 
I0320 17:55:54.588411 4795 scope.go:117] "RemoveContainer" containerID="88b194a74064309f622b8e25f76f210948d20e5936b41beb91453d2773fb7483" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.166241 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567156-cp2gz"] Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.169698 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.176764 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.177488 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.178558 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.198547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-cp2gz"] Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.200125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwxd6\" (UniqueName: \"kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6\") pod \"auto-csr-approver-29567156-cp2gz\" (UID: \"23b9cffc-8f64-481b-9f51-334e3e04ed7b\") " pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.301926 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwxd6\" (UniqueName: \"kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6\") pod \"auto-csr-approver-29567156-cp2gz\" (UID: 
\"23b9cffc-8f64-481b-9f51-334e3e04ed7b\") " pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.332854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwxd6\" (UniqueName: \"kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6\") pod \"auto-csr-approver-29567156-cp2gz\" (UID: \"23b9cffc-8f64-481b-9f51-334e3e04ed7b\") " pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:00 crc kubenswrapper[4795]: I0320 17:56:00.508565 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:01 crc kubenswrapper[4795]: I0320 17:56:01.003432 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-cp2gz"] Mar 20 17:56:01 crc kubenswrapper[4795]: I0320 17:56:01.467797 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" event={"ID":"23b9cffc-8f64-481b-9f51-334e3e04ed7b","Type":"ContainerStarted","Data":"a30f48bfe88311e7f2682f264a4f3bc2e92038fc2943c09e9fcd3b4e26c94210"} Mar 20 17:56:03 crc kubenswrapper[4795]: I0320 17:56:03.492479 4795 generic.go:334] "Generic (PLEG): container finished" podID="23b9cffc-8f64-481b-9f51-334e3e04ed7b" containerID="025d16245a433259b825961c6fb9d8ed0412608aa4b43ab349fe67ca35e229a7" exitCode=0 Mar 20 17:56:03 crc kubenswrapper[4795]: I0320 17:56:03.492617 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" event={"ID":"23b9cffc-8f64-481b-9f51-334e3e04ed7b","Type":"ContainerDied","Data":"025d16245a433259b825961c6fb9d8ed0412608aa4b43ab349fe67ca35e229a7"} Mar 20 17:56:04 crc kubenswrapper[4795]: I0320 17:56:04.908112 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.004510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwxd6\" (UniqueName: \"kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6\") pod \"23b9cffc-8f64-481b-9f51-334e3e04ed7b\" (UID: \"23b9cffc-8f64-481b-9f51-334e3e04ed7b\") " Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.010649 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6" (OuterVolumeSpecName: "kube-api-access-jwxd6") pod "23b9cffc-8f64-481b-9f51-334e3e04ed7b" (UID: "23b9cffc-8f64-481b-9f51-334e3e04ed7b"). InnerVolumeSpecName "kube-api-access-jwxd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.106954 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwxd6\" (UniqueName: \"kubernetes.io/projected/23b9cffc-8f64-481b-9f51-334e3e04ed7b-kube-api-access-jwxd6\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.515960 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" event={"ID":"23b9cffc-8f64-481b-9f51-334e3e04ed7b","Type":"ContainerDied","Data":"a30f48bfe88311e7f2682f264a4f3bc2e92038fc2943c09e9fcd3b4e26c94210"} Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.516016 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a30f48bfe88311e7f2682f264a4f3bc2e92038fc2943c09e9fcd3b4e26c94210" Mar 20 17:56:05 crc kubenswrapper[4795]: I0320 17:56:05.516037 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-cp2gz" Mar 20 17:56:06 crc kubenswrapper[4795]: I0320 17:56:06.002894 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-glzlx"] Mar 20 17:56:06 crc kubenswrapper[4795]: I0320 17:56:06.012846 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-glzlx"] Mar 20 17:56:07 crc kubenswrapper[4795]: I0320 17:56:07.269010 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e628e06-ee48-4969-824e-fba400b67d3a" path="/var/lib/kubelet/pods/5e628e06-ee48-4969-824e-fba400b67d3a/volumes" Mar 20 17:56:11 crc kubenswrapper[4795]: I0320 17:56:11.300135 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:56:11 crc kubenswrapper[4795]: I0320 17:56:11.300243 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.897473 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-67sls"] Mar 20 17:56:25 crc kubenswrapper[4795]: E0320 17:56:25.899272 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b9cffc-8f64-481b-9f51-334e3e04ed7b" containerName="oc" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.899310 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b9cffc-8f64-481b-9f51-334e3e04ed7b" containerName="oc" Mar 20 17:56:25 crc 
kubenswrapper[4795]: I0320 17:56:25.899792 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b9cffc-8f64-481b-9f51-334e3e04ed7b" containerName="oc" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.902888 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.945332 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67sls"] Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.983297 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.983719 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwskk\" (UniqueName: \"kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:25 crc kubenswrapper[4795]: I0320 17:56:25.983749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.085909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.086006 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwskk\" (UniqueName: \"kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.086037 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.086534 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.086840 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls" Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.112941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwskk\" (UniqueName: 
\"kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk\") pod \"redhat-marketplace-67sls\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") " pod="openshift-marketplace/redhat-marketplace-67sls"
Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.239222 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67sls"
Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.714923 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67sls"]
Mar 20 17:56:26 crc kubenswrapper[4795]: I0320 17:56:26.770289 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerStarted","Data":"7671c9adfde4a05a6d9e863366f8c916de7cad7cd8a8af69c46501654cef5383"}
Mar 20 17:56:27 crc kubenswrapper[4795]: I0320 17:56:27.782801 4795 generic.go:334] "Generic (PLEG): container finished" podID="98c1605e-8284-489e-83f0-bab45156e299" containerID="a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c" exitCode=0
Mar 20 17:56:27 crc kubenswrapper[4795]: I0320 17:56:27.782909 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerDied","Data":"a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c"}
Mar 20 17:56:28 crc kubenswrapper[4795]: I0320 17:56:28.793526 4795 generic.go:334] "Generic (PLEG): container finished" podID="98c1605e-8284-489e-83f0-bab45156e299" containerID="7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d" exitCode=0
Mar 20 17:56:28 crc kubenswrapper[4795]: I0320 17:56:28.794561 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerDied","Data":"7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d"}
Mar 20 17:56:29 crc kubenswrapper[4795]: I0320 17:56:29.806710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerStarted","Data":"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe"}
Mar 20 17:56:29 crc kubenswrapper[4795]: I0320 17:56:29.836062 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-67sls" podStartSLOduration=3.4231254939999998 podStartE2EDuration="4.836038818s" podCreationTimestamp="2026-03-20 17:56:25 +0000 UTC" firstStartedPulling="2026-03-20 17:56:27.784909403 +0000 UTC m=+2331.242940974" lastFinishedPulling="2026-03-20 17:56:29.197822757 +0000 UTC m=+2332.655854298" observedRunningTime="2026-03-20 17:56:29.824816248 +0000 UTC m=+2333.282847789" watchObservedRunningTime="2026-03-20 17:56:29.836038818 +0000 UTC m=+2333.294070359"
Mar 20 17:56:36 crc kubenswrapper[4795]: I0320 17:56:36.239813 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-67sls"
Mar 20 17:56:36 crc kubenswrapper[4795]: I0320 17:56:36.240441 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-67sls"
Mar 20 17:56:36 crc kubenswrapper[4795]: I0320 17:56:36.288460 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-67sls"
Mar 20 17:56:36 crc kubenswrapper[4795]: I0320 17:56:36.962741 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-67sls"
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.117160 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67sls"]
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.117726 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-67sls" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="registry-server" containerID="cri-o://29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe" gracePeriod=2
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.605653 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67sls"
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.714936 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities\") pod \"98c1605e-8284-489e-83f0-bab45156e299\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") "
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.715167 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content\") pod \"98c1605e-8284-489e-83f0-bab45156e299\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") "
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.715247 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwskk\" (UniqueName: \"kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk\") pod \"98c1605e-8284-489e-83f0-bab45156e299\" (UID: \"98c1605e-8284-489e-83f0-bab45156e299\") "
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.715971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities" (OuterVolumeSpecName: "utilities") pod "98c1605e-8284-489e-83f0-bab45156e299" (UID: "98c1605e-8284-489e-83f0-bab45156e299"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.720970 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk" (OuterVolumeSpecName: "kube-api-access-jwskk") pod "98c1605e-8284-489e-83f0-bab45156e299" (UID: "98c1605e-8284-489e-83f0-bab45156e299"). InnerVolumeSpecName "kube-api-access-jwskk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.743674 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98c1605e-8284-489e-83f0-bab45156e299" (UID: "98c1605e-8284-489e-83f0-bab45156e299"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.817850 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.817887 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwskk\" (UniqueName: \"kubernetes.io/projected/98c1605e-8284-489e-83f0-bab45156e299-kube-api-access-jwskk\") on node \"crc\" DevicePath \"\""
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.817902 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c1605e-8284-489e-83f0-bab45156e299-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.941282 4795 generic.go:334] "Generic (PLEG): container finished" podID="98c1605e-8284-489e-83f0-bab45156e299" containerID="29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe" exitCode=0
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.941337 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerDied","Data":"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe"}
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.941373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67sls" event={"ID":"98c1605e-8284-489e-83f0-bab45156e299","Type":"ContainerDied","Data":"7671c9adfde4a05a6d9e863366f8c916de7cad7cd8a8af69c46501654cef5383"}
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.941401 4795 scope.go:117] "RemoveContainer" containerID="29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe"
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.941340 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67sls"
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.986560 4795 scope.go:117] "RemoveContainer" containerID="7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d"
Mar 20 17:56:40 crc kubenswrapper[4795]: I0320 17:56:40.989331 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67sls"]
Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.008177 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-67sls"]
Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.033024 4795 scope.go:117] "RemoveContainer" containerID="a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c"
Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.072425 4795 scope.go:117] "RemoveContainer" containerID="29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe"
Mar 20 17:56:41 crc kubenswrapper[4795]: E0320 17:56:41.073077 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe\": container with ID starting with 29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe not found: ID does not exist" containerID="29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe"
Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.073138 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe"} err="failed to get container status \"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe\": rpc error: code = NotFound desc = could not find container \"29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe\": container with ID starting with 29604e4cfe4230947582a742499dffa0c8405ebde804c5692e3721cb766ccbbe not found: ID does not exist"
Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.073178 4795 scope.go:117] "RemoveContainer" containerID="7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d"
Mar 20 17:56:41 crc kubenswrapper[4795]: E0320 17:56:41.073719 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d\": container with ID starting with 7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d not found: ID does not exist" containerID="7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d"
Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.073751 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d"} err="failed to get container status \"7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d\": rpc error: code = NotFound desc = could not find container \"7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d\": container with ID starting with 7ddabe925c64f498930da9f4642367d2e4aa80f2a3892548e808f579a3bde62d not found: ID does not exist"
Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.073772 4795 scope.go:117] "RemoveContainer" containerID="a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c"
Mar 20 17:56:41 crc kubenswrapper[4795]: E0320 17:56:41.074120 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c\": container with ID starting with a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c not found: ID does not exist" containerID="a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c"
Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.074190 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c"} err="failed to get container status \"a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c\": rpc error: code = NotFound desc = could not find container \"a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c\": container with ID starting with a49d36937b82db91caa89d039d9fa25b32a91cfaf295eaa9df03715fa8a3b23c not found: ID does not exist"
Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.273857 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c1605e-8284-489e-83f0-bab45156e299" path="/var/lib/kubelet/pods/98c1605e-8284-489e-83f0-bab45156e299/volumes"
Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.300970 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:56:41 crc kubenswrapper[4795]: I0320 17:56:41.301046 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:56:54 crc kubenswrapper[4795]: I0320 17:56:54.679273 4795 scope.go:117] "RemoveContainer" containerID="b0e8f1ce702c9e1cfb11740285e904a1e8d1f711ef3e97850efbb6236da59523"
Mar 20 17:57:11 crc kubenswrapper[4795]: I0320 17:57:11.300225 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:57:11 crc kubenswrapper[4795]: I0320 17:57:11.300944 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:57:11 crc kubenswrapper[4795]: I0320 17:57:11.301010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt"
Mar 20 17:57:11 crc kubenswrapper[4795]: I0320 17:57:11.302108 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 17:57:11 crc kubenswrapper[4795]: I0320 17:57:11.302203 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" gracePeriod=600
Mar 20 17:57:11 crc kubenswrapper[4795]: E0320 17:57:11.429140 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:57:12 crc kubenswrapper[4795]: I0320 17:57:12.321196 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" exitCode=0
Mar 20 17:57:12 crc kubenswrapper[4795]: I0320 17:57:12.321280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"}
Mar 20 17:57:12 crc kubenswrapper[4795]: I0320 17:57:12.322461 4795 scope.go:117] "RemoveContainer" containerID="f478057908514f404898e9faf2bfe722ba3bd34e6544b94c8821577bb2d2d3a5"
Mar 20 17:57:12 crc kubenswrapper[4795]: I0320 17:57:12.323296 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 17:57:12 crc kubenswrapper[4795]: E0320 17:57:12.323753 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:57:27 crc kubenswrapper[4795]: I0320 17:57:27.260588 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 17:57:27 crc kubenswrapper[4795]: E0320 17:57:27.261757 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:57:38 crc kubenswrapper[4795]: I0320 17:57:38.252629 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 17:57:38 crc kubenswrapper[4795]: E0320 17:57:38.253938 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:57:52 crc kubenswrapper[4795]: I0320 17:57:52.252335 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 17:57:52 crc kubenswrapper[4795]: E0320 17:57:52.253370 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.153353 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567158-45rzl"]
Mar 20 17:58:00 crc kubenswrapper[4795]: E0320 17:58:00.154312 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="extract-content"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.154331 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="extract-content"
Mar 20 17:58:00 crc kubenswrapper[4795]: E0320 17:58:00.154361 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="registry-server"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.154370 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="registry-server"
Mar 20 17:58:00 crc kubenswrapper[4795]: E0320 17:58:00.154387 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="extract-utilities"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.154395 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="extract-utilities"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.154616 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c1605e-8284-489e-83f0-bab45156e299" containerName="registry-server"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.155413 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-45rzl"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.158890 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.160154 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.161501 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.162464 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-45rzl"]
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.249394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5qxf\" (UniqueName: \"kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf\") pod \"auto-csr-approver-29567158-45rzl\" (UID: \"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed\") " pod="openshift-infra/auto-csr-approver-29567158-45rzl"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.350918 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5qxf\" (UniqueName: \"kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf\") pod \"auto-csr-approver-29567158-45rzl\" (UID: \"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed\") " pod="openshift-infra/auto-csr-approver-29567158-45rzl"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.381247 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5qxf\" (UniqueName: \"kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf\") pod \"auto-csr-approver-29567158-45rzl\" (UID: \"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed\") " pod="openshift-infra/auto-csr-approver-29567158-45rzl"
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.492749 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-45rzl"
Mar 20 17:58:00 crc kubenswrapper[4795]: W0320 17:58:00.930454 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08fd2b90_3d66_4e64_bbb0_c4eaf75e0aed.slice/crio-2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32 WatchSource:0}: Error finding container 2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32: Status 404 returned error can't find the container with id 2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32
Mar 20 17:58:00 crc kubenswrapper[4795]: I0320 17:58:00.934835 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-45rzl"]
Mar 20 17:58:01 crc kubenswrapper[4795]: I0320 17:58:01.846044 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-45rzl" event={"ID":"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed","Type":"ContainerStarted","Data":"2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32"}
Mar 20 17:58:02 crc kubenswrapper[4795]: I0320 17:58:02.858966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-45rzl" event={"ID":"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed","Type":"ContainerStarted","Data":"62ed101e828326cc7fffe40dc572b4d86ec23a0ff89623436e14e45075fbfa9a"}
Mar 20 17:58:02 crc kubenswrapper[4795]: I0320 17:58:02.881403 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567158-45rzl" podStartSLOduration=1.5707947199999999 podStartE2EDuration="2.881379475s" podCreationTimestamp="2026-03-20 17:58:00 +0000 UTC" firstStartedPulling="2026-03-20 17:58:00.934979433 +0000 UTC m=+2424.393011004" lastFinishedPulling="2026-03-20 17:58:02.245564208 +0000 UTC m=+2425.703595759" observedRunningTime="2026-03-20 17:58:02.873890842 +0000 UTC m=+2426.331922403" watchObservedRunningTime="2026-03-20 17:58:02.881379475 +0000 UTC m=+2426.339411046"
Mar 20 17:58:03 crc kubenswrapper[4795]: I0320 17:58:03.871713 4795 generic.go:334] "Generic (PLEG): container finished" podID="08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" containerID="62ed101e828326cc7fffe40dc572b4d86ec23a0ff89623436e14e45075fbfa9a" exitCode=0
Mar 20 17:58:03 crc kubenswrapper[4795]: I0320 17:58:03.871796 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-45rzl" event={"ID":"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed","Type":"ContainerDied","Data":"62ed101e828326cc7fffe40dc572b4d86ec23a0ff89623436e14e45075fbfa9a"}
Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.216791 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-45rzl"
Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.353538 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5qxf\" (UniqueName: \"kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf\") pod \"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed\" (UID: \"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed\") "
Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.359899 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf" (OuterVolumeSpecName: "kube-api-access-t5qxf") pod "08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" (UID: "08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed"). InnerVolumeSpecName "kube-api-access-t5qxf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.456888 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5qxf\" (UniqueName: \"kubernetes.io/projected/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed-kube-api-access-t5qxf\") on node \"crc\" DevicePath \"\""
Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.893156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-45rzl" event={"ID":"08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed","Type":"ContainerDied","Data":"2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32"}
Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.893210 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fe15b1ce3d728566313b4b3291accfff84366bc333c9420ca826d56db97bb32"
Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.893284 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-45rzl"
Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.965594 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-ksnjt"]
Mar 20 17:58:05 crc kubenswrapper[4795]: I0320 17:58:05.975570 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-ksnjt"]
Mar 20 17:58:06 crc kubenswrapper[4795]: I0320 17:58:06.252453 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 17:58:06 crc kubenswrapper[4795]: E0320 17:58:06.252942 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:58:07 crc kubenswrapper[4795]: I0320 17:58:07.264498 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8abf4de-a372-47df-b14c-490f1e084a56" path="/var/lib/kubelet/pods/d8abf4de-a372-47df-b14c-490f1e084a56/volumes"
Mar 20 17:58:20 crc kubenswrapper[4795]: I0320 17:58:20.252754 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 17:58:20 crc kubenswrapper[4795]: E0320 17:58:20.253841 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:58:35 crc kubenswrapper[4795]: I0320 17:58:35.251847 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 17:58:35 crc kubenswrapper[4795]: E0320 17:58:35.252439 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:58:50 crc kubenswrapper[4795]: I0320 17:58:50.252852 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 17:58:50 crc kubenswrapper[4795]: E0320 17:58:50.253670 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:58:54 crc kubenswrapper[4795]: I0320 17:58:54.842918 4795 scope.go:117] "RemoveContainer" containerID="fa7c9e74af14d50a1c364d101636ba64da237edd40eebd00160c638dba974672"
Mar 20 17:59:05 crc kubenswrapper[4795]: I0320 17:59:05.253161 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 17:59:05 crc kubenswrapper[4795]: E0320 17:59:05.254283 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:59:16 crc kubenswrapper[4795]: I0320 17:59:16.252759 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 17:59:16 crc kubenswrapper[4795]: E0320 17:59:16.253898 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 17:59:20 crc kubenswrapper[4795]: I0320 17:59:20.842316 4795 generic.go:334] "Generic (PLEG): container finished" podID="b6da9d2a-e18f-4994-b8f3-6b1eb969564b" containerID="916d45486bc7c9429b47d1621e1445553bb690d12dcaa2aceb7cbd80e6648c0c" exitCode=0
Mar 20 17:59:20 crc kubenswrapper[4795]: I0320 17:59:20.842667 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" event={"ID":"b6da9d2a-e18f-4994-b8f3-6b1eb969564b","Type":"ContainerDied","Data":"916d45486bc7c9429b47d1621e1445553bb690d12dcaa2aceb7cbd80e6648c0c"}
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.361124 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q"
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.427354 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle\") pod \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") "
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.427496 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dshp7\" (UniqueName: \"kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7\") pod \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") "
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.427532 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam\") pod \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") "
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.427644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory\") pod \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") "
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.427760 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0\") pod \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\" (UID: \"b6da9d2a-e18f-4994-b8f3-6b1eb969564b\") "
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.433205 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7" (OuterVolumeSpecName: "kube-api-access-dshp7") pod "b6da9d2a-e18f-4994-b8f3-6b1eb969564b" (UID: "b6da9d2a-e18f-4994-b8f3-6b1eb969564b"). InnerVolumeSpecName "kube-api-access-dshp7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.441878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b6da9d2a-e18f-4994-b8f3-6b1eb969564b" (UID: "b6da9d2a-e18f-4994-b8f3-6b1eb969564b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.462220 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b6da9d2a-e18f-4994-b8f3-6b1eb969564b" (UID: "b6da9d2a-e18f-4994-b8f3-6b1eb969564b"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.463235 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory" (OuterVolumeSpecName: "inventory") pod "b6da9d2a-e18f-4994-b8f3-6b1eb969564b" (UID: "b6da9d2a-e18f-4994-b8f3-6b1eb969564b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.468893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b6da9d2a-e18f-4994-b8f3-6b1eb969564b" (UID: "b6da9d2a-e18f-4994-b8f3-6b1eb969564b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.529388 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.529420 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.529431 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.529439 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dshp7\" (UniqueName: \"kubernetes.io/projected/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-kube-api-access-dshp7\") on node \"crc\" DevicePath \"\""
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.529449 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6da9d2a-e18f-4994-b8f3-6b1eb969564b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.863610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" event={"ID":"b6da9d2a-e18f-4994-b8f3-6b1eb969564b","Type":"ContainerDied","Data":"ce67ae3c6d3b823d115ffd8cc57bd8b2b930ee03e39a23bfb62f0f80486fdc98"}
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.864041 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce67ae3c6d3b823d115ffd8cc57bd8b2b930ee03e39a23bfb62f0f80486fdc98"
Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.863728 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp77q" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.962415 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx"] Mar 20 17:59:22 crc kubenswrapper[4795]: E0320 17:59:22.962808 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" containerName="oc" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.962822 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" containerName="oc" Mar 20 17:59:22 crc kubenswrapper[4795]: E0320 17:59:22.962838 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6da9d2a-e18f-4994-b8f3-6b1eb969564b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.962845 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6da9d2a-e18f-4994-b8f3-6b1eb969564b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.963009 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" containerName="oc" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.963030 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6da9d2a-e18f-4994-b8f3-6b1eb969564b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.964026 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.966115 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.966115 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.966733 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.969391 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.969533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.969625 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.969937 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:59:22 crc kubenswrapper[4795]: I0320 17:59:22.994308 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx"] Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.036995 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: 
I0320 17:59:23.037044 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037166 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037208 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037253 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037278 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nl9n\" (UniqueName: \"kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: 
\"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.037403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139167 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139209 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nl9n\" (UniqueName: \"kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139343 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139370 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: 
\"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139501 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139527 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.139570 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.140516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.143840 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.143863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.144749 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.145047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.145654 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.145909 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.147376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.149171 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.150380 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.167581 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nl9n\" (UniqueName: 
\"kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kc4wx\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.287230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" Mar 20 17:59:23 crc kubenswrapper[4795]: I0320 17:59:23.917752 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx"] Mar 20 17:59:24 crc kubenswrapper[4795]: I0320 17:59:24.882173 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" event={"ID":"709f5080-c511-4d3b-bc9c-baeec85fa245","Type":"ContainerStarted","Data":"e18c048a597dc4d9d215e2e77b0cde124882e9a2ee7a96faf5c2c8ceff8b067d"} Mar 20 17:59:25 crc kubenswrapper[4795]: I0320 17:59:25.892083 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" event={"ID":"709f5080-c511-4d3b-bc9c-baeec85fa245","Type":"ContainerStarted","Data":"0c4a86a4e0bd9983c0776f522d0ac2f129ecc0ae67cd8e75fe2c36c5fe922436"} Mar 20 17:59:25 crc kubenswrapper[4795]: I0320 17:59:25.925717 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" podStartSLOduration=3.139207976 podStartE2EDuration="3.925669226s" podCreationTimestamp="2026-03-20 17:59:22 +0000 UTC" firstStartedPulling="2026-03-20 17:59:23.919337796 +0000 UTC m=+2507.377369347" lastFinishedPulling="2026-03-20 17:59:24.705799016 +0000 UTC m=+2508.163830597" observedRunningTime="2026-03-20 17:59:25.91681053 +0000 UTC m=+2509.374842081" watchObservedRunningTime="2026-03-20 17:59:25.925669226 +0000 UTC m=+2509.383700787" Mar 20 17:59:27 crc kubenswrapper[4795]: I0320 
17:59:27.267636 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:59:27 crc kubenswrapper[4795]: E0320 17:59:27.268143 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:59:42 crc kubenswrapper[4795]: I0320 17:59:42.252851 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:59:42 crc kubenswrapper[4795]: E0320 17:59:42.253822 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 17:59:55 crc kubenswrapper[4795]: I0320 17:59:55.253181 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 17:59:55 crc kubenswrapper[4795]: E0320 17:59:55.254168 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:00:00 crc 
kubenswrapper[4795]: I0320 18:00:00.174041 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff"] Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.176186 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.180468 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.180898 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.182821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsz95\" (UniqueName: \"kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.182903 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.183161 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: 
\"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.192799 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff"] Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.263195 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567160-k8xrk"] Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.264754 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.267711 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.267807 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.269349 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.286276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk24v\" (UniqueName: \"kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v\") pod \"auto-csr-approver-29567160-k8xrk\" (UID: \"a01c5ccc-0cea-415d-969d-64f17a21036b\") " pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.286845 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.286976 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsz95\" (UniqueName: \"kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.287040 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.289137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.293835 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-k8xrk"] Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.305683 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.310481 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsz95\" (UniqueName: \"kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95\") pod \"collect-profiles-29567160-pnnff\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.389171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk24v\" (UniqueName: \"kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v\") pod \"auto-csr-approver-29567160-k8xrk\" (UID: \"a01c5ccc-0cea-415d-969d-64f17a21036b\") " pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.413949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk24v\" (UniqueName: \"kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v\") pod \"auto-csr-approver-29567160-k8xrk\" (UID: \"a01c5ccc-0cea-415d-969d-64f17a21036b\") " pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.502126 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:00 crc kubenswrapper[4795]: I0320 18:00:00.588799 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:01 crc kubenswrapper[4795]: W0320 18:00:01.009534 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06aca85b_9cb4_47ae_ad12_b1cc429c542d.slice/crio-25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2 WatchSource:0}: Error finding container 25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2: Status 404 returned error can't find the container with id 25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2 Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.013367 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff"] Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.101399 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-k8xrk"] Mar 20 18:00:01 crc kubenswrapper[4795]: W0320 18:00:01.101936 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda01c5ccc_0cea_415d_969d_64f17a21036b.slice/crio-c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a WatchSource:0}: Error finding container c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a: Status 404 returned error can't find the container with id c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.104698 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.294007 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" 
event={"ID":"06aca85b-9cb4-47ae-ad12-b1cc429c542d","Type":"ContainerStarted","Data":"1dda362040903edd45488eb5dfa4174252f2f44818cf7249d6ad4da4aa90fe4e"} Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.294342 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" event={"ID":"06aca85b-9cb4-47ae-ad12-b1cc429c542d","Type":"ContainerStarted","Data":"25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2"} Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.295566 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" event={"ID":"a01c5ccc-0cea-415d-969d-64f17a21036b","Type":"ContainerStarted","Data":"c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a"} Mar 20 18:00:01 crc kubenswrapper[4795]: I0320 18:00:01.330551 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" podStartSLOduration=1.330530772 podStartE2EDuration="1.330530772s" podCreationTimestamp="2026-03-20 18:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:00:01.322037568 +0000 UTC m=+2544.780069109" watchObservedRunningTime="2026-03-20 18:00:01.330530772 +0000 UTC m=+2544.788562313" Mar 20 18:00:02 crc kubenswrapper[4795]: I0320 18:00:02.306779 4795 generic.go:334] "Generic (PLEG): container finished" podID="06aca85b-9cb4-47ae-ad12-b1cc429c542d" containerID="1dda362040903edd45488eb5dfa4174252f2f44818cf7249d6ad4da4aa90fe4e" exitCode=0 Mar 20 18:00:02 crc kubenswrapper[4795]: I0320 18:00:02.306841 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" 
event={"ID":"06aca85b-9cb4-47ae-ad12-b1cc429c542d","Type":"ContainerDied","Data":"1dda362040903edd45488eb5dfa4174252f2f44818cf7249d6ad4da4aa90fe4e"} Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.656999 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.851132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume\") pod \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.851234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsz95\" (UniqueName: \"kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95\") pod \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.851374 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume\") pod \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\" (UID: \"06aca85b-9cb4-47ae-ad12-b1cc429c542d\") " Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.852312 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume" (OuterVolumeSpecName: "config-volume") pod "06aca85b-9cb4-47ae-ad12-b1cc429c542d" (UID: "06aca85b-9cb4-47ae-ad12-b1cc429c542d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.856898 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06aca85b-9cb4-47ae-ad12-b1cc429c542d" (UID: "06aca85b-9cb4-47ae-ad12-b1cc429c542d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.861919 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95" (OuterVolumeSpecName: "kube-api-access-zsz95") pod "06aca85b-9cb4-47ae-ad12-b1cc429c542d" (UID: "06aca85b-9cb4-47ae-ad12-b1cc429c542d"). InnerVolumeSpecName "kube-api-access-zsz95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.953752 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06aca85b-9cb4-47ae-ad12-b1cc429c542d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.954115 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsz95\" (UniqueName: \"kubernetes.io/projected/06aca85b-9cb4-47ae-ad12-b1cc429c542d-kube-api-access-zsz95\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:03 crc kubenswrapper[4795]: I0320 18:00:03.954187 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06aca85b-9cb4-47ae-ad12-b1cc429c542d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:04 crc kubenswrapper[4795]: I0320 18:00:04.333808 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" 
event={"ID":"06aca85b-9cb4-47ae-ad12-b1cc429c542d","Type":"ContainerDied","Data":"25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2"} Mar 20 18:00:04 crc kubenswrapper[4795]: I0320 18:00:04.333878 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25d5f707c5181c6c65f343cadd38b9c1863e027a3d47fc5e38e6729f657319a2" Mar 20 18:00:04 crc kubenswrapper[4795]: I0320 18:00:04.333881 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff" Mar 20 18:00:04 crc kubenswrapper[4795]: I0320 18:00:04.418387 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"] Mar 20 18:00:04 crc kubenswrapper[4795]: I0320 18:00:04.427877 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-v58cr"] Mar 20 18:00:05 crc kubenswrapper[4795]: I0320 18:00:05.267147 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="918aa57e-8c94-4427-b6bd-218a5687d684" path="/var/lib/kubelet/pods/918aa57e-8c94-4427-b6bd-218a5687d684/volumes" Mar 20 18:00:08 crc kubenswrapper[4795]: I0320 18:00:08.253189 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:00:08 crc kubenswrapper[4795]: E0320 18:00:08.253800 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:00:19 crc kubenswrapper[4795]: I0320 18:00:19.253278 4795 scope.go:117] "RemoveContainer" 
containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:00:19 crc kubenswrapper[4795]: E0320 18:00:19.254128 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:00:26 crc kubenswrapper[4795]: I0320 18:00:26.565762 4795 generic.go:334] "Generic (PLEG): container finished" podID="a01c5ccc-0cea-415d-969d-64f17a21036b" containerID="476381efe19b63607c82b31ec27f9dd8365ccb9f0dd4f0e61aba026eba68e6a2" exitCode=0 Mar 20 18:00:26 crc kubenswrapper[4795]: I0320 18:00:26.565872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" event={"ID":"a01c5ccc-0cea-415d-969d-64f17a21036b","Type":"ContainerDied","Data":"476381efe19b63607c82b31ec27f9dd8365ccb9f0dd4f0e61aba026eba68e6a2"} Mar 20 18:00:27 crc kubenswrapper[4795]: I0320 18:00:27.931903 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.001139 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk24v\" (UniqueName: \"kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v\") pod \"a01c5ccc-0cea-415d-969d-64f17a21036b\" (UID: \"a01c5ccc-0cea-415d-969d-64f17a21036b\") " Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.006411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v" (OuterVolumeSpecName: "kube-api-access-zk24v") pod "a01c5ccc-0cea-415d-969d-64f17a21036b" (UID: "a01c5ccc-0cea-415d-969d-64f17a21036b"). InnerVolumeSpecName "kube-api-access-zk24v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.102884 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk24v\" (UniqueName: \"kubernetes.io/projected/a01c5ccc-0cea-415d-969d-64f17a21036b-kube-api-access-zk24v\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.590331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" event={"ID":"a01c5ccc-0cea-415d-969d-64f17a21036b","Type":"ContainerDied","Data":"c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a"} Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.590378 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c18933276577b8cd908e6912ca1839fb92764f1fdc5470849519e18405a6429a" Mar 20 18:00:28 crc kubenswrapper[4795]: I0320 18:00:28.590383 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-k8xrk" Mar 20 18:00:29 crc kubenswrapper[4795]: I0320 18:00:29.002605 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-4z2rq"] Mar 20 18:00:29 crc kubenswrapper[4795]: I0320 18:00:29.009613 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-4z2rq"] Mar 20 18:00:29 crc kubenswrapper[4795]: I0320 18:00:29.267886 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61821949-5c88-4f4c-adab-b93269540a03" path="/var/lib/kubelet/pods/61821949-5c88-4f4c-adab-b93269540a03/volumes" Mar 20 18:00:32 crc kubenswrapper[4795]: I0320 18:00:32.252237 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:00:32 crc kubenswrapper[4795]: E0320 18:00:32.252941 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:00:47 crc kubenswrapper[4795]: I0320 18:00:47.257856 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:00:47 crc kubenswrapper[4795]: E0320 18:00:47.258519 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:00:54 crc kubenswrapper[4795]: I0320 18:00:54.955627 4795 scope.go:117] "RemoveContainer" containerID="6f68cab9e191fff6af7e246da5293fa0fd1c14c356566586bb75900bd179fcf6" Mar 20 18:00:54 crc kubenswrapper[4795]: I0320 18:00:54.997246 4795 scope.go:117] "RemoveContainer" containerID="1a29e74f6dc8f40ef08045f483f837253c35a577aa6f85ce5cd8c2a56afebf9c" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.180873 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567161-t26vc"] Mar 20 18:01:00 crc kubenswrapper[4795]: E0320 18:01:00.183155 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06aca85b-9cb4-47ae-ad12-b1cc429c542d" containerName="collect-profiles" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.183198 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="06aca85b-9cb4-47ae-ad12-b1cc429c542d" containerName="collect-profiles" Mar 20 18:01:00 crc kubenswrapper[4795]: E0320 18:01:00.183214 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01c5ccc-0cea-415d-969d-64f17a21036b" containerName="oc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.183223 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01c5ccc-0cea-415d-969d-64f17a21036b" containerName="oc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.183745 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="06aca85b-9cb4-47ae-ad12-b1cc429c542d" containerName="collect-profiles" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.183949 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a01c5ccc-0cea-415d-969d-64f17a21036b" containerName="oc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.187586 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.189820 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567161-t26vc"] Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.252158 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:01:00 crc kubenswrapper[4795]: E0320 18:01:00.252511 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.264207 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.264306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.264452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys\") pod 
\"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.264498 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4qsp\" (UniqueName: \"kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.366444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4qsp\" (UniqueName: \"kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.366561 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.366713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.366793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys\") pod 
\"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.372331 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.375816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.378810 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.387812 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4qsp\" (UniqueName: \"kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp\") pod \"keystone-cron-29567161-t26vc\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.527462 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:00 crc kubenswrapper[4795]: I0320 18:01:00.990610 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567161-t26vc"] Mar 20 18:01:01 crc kubenswrapper[4795]: I0320 18:01:01.880553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-t26vc" event={"ID":"cdfe5ffc-ab15-4277-966f-f506e725e8b1","Type":"ContainerStarted","Data":"91a2f5603edcd1f18f4d99385ba309c78145d20156d550a342b1889c19be8173"} Mar 20 18:01:01 crc kubenswrapper[4795]: I0320 18:01:01.880838 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-t26vc" event={"ID":"cdfe5ffc-ab15-4277-966f-f506e725e8b1","Type":"ContainerStarted","Data":"dbf97ccf4d81b44dc6b8e01bef5228dd6f8628066161ee1e7b1585a28ab22358"} Mar 20 18:01:01 crc kubenswrapper[4795]: I0320 18:01:01.919085 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29567161-t26vc" podStartSLOduration=1.9190604470000001 podStartE2EDuration="1.919060447s" podCreationTimestamp="2026-03-20 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:01:01.914260787 +0000 UTC m=+2605.372292338" watchObservedRunningTime="2026-03-20 18:01:01.919060447 +0000 UTC m=+2605.377091988" Mar 20 18:01:03 crc kubenswrapper[4795]: I0320 18:01:03.903402 4795 generic.go:334] "Generic (PLEG): container finished" podID="cdfe5ffc-ab15-4277-966f-f506e725e8b1" containerID="91a2f5603edcd1f18f4d99385ba309c78145d20156d550a342b1889c19be8173" exitCode=0 Mar 20 18:01:03 crc kubenswrapper[4795]: I0320 18:01:03.903837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-t26vc" 
event={"ID":"cdfe5ffc-ab15-4277-966f-f506e725e8b1","Type":"ContainerDied","Data":"91a2f5603edcd1f18f4d99385ba309c78145d20156d550a342b1889c19be8173"} Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.294839 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567161-t26vc" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.375341 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys\") pod \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.375539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4qsp\" (UniqueName: \"kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp\") pod \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.375710 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data\") pod \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.375744 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle\") pod \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\" (UID: \"cdfe5ffc-ab15-4277-966f-f506e725e8b1\") " Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.382878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp" 
(OuterVolumeSpecName: "kube-api-access-m4qsp") pod "cdfe5ffc-ab15-4277-966f-f506e725e8b1" (UID: "cdfe5ffc-ab15-4277-966f-f506e725e8b1"). InnerVolumeSpecName "kube-api-access-m4qsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.392946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cdfe5ffc-ab15-4277-966f-f506e725e8b1" (UID: "cdfe5ffc-ab15-4277-966f-f506e725e8b1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.428620 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdfe5ffc-ab15-4277-966f-f506e725e8b1" (UID: "cdfe5ffc-ab15-4277-966f-f506e725e8b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.448106 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data" (OuterVolumeSpecName: "config-data") pod "cdfe5ffc-ab15-4277-966f-f506e725e8b1" (UID: "cdfe5ffc-ab15-4277-966f-f506e725e8b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.478439 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4qsp\" (UniqueName: \"kubernetes.io/projected/cdfe5ffc-ab15-4277-966f-f506e725e8b1-kube-api-access-m4qsp\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.478820 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.478835 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.478847 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfe5ffc-ab15-4277-966f-f506e725e8b1-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.926214 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-t26vc" event={"ID":"cdfe5ffc-ab15-4277-966f-f506e725e8b1","Type":"ContainerDied","Data":"dbf97ccf4d81b44dc6b8e01bef5228dd6f8628066161ee1e7b1585a28ab22358"} Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.926248 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf97ccf4d81b44dc6b8e01bef5228dd6f8628066161ee1e7b1585a28ab22358" Mar 20 18:01:05 crc kubenswrapper[4795]: I0320 18:01:05.926290 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567161-t26vc"
Mar 20 18:01:11 crc kubenswrapper[4795]: I0320 18:01:11.253023 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 18:01:11 crc kubenswrapper[4795]: E0320 18:01:11.253616 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:01:22 crc kubenswrapper[4795]: I0320 18:01:22.251987 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 18:01:22 crc kubenswrapper[4795]: E0320 18:01:22.252522 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:01:37 crc kubenswrapper[4795]: I0320 18:01:37.260766 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 18:01:37 crc kubenswrapper[4795]: E0320 18:01:37.261982 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:01:49 crc kubenswrapper[4795]: I0320 18:01:49.252293 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 18:01:49 crc kubenswrapper[4795]: E0320 18:01:49.253300 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.163367 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567162-729s4"]
Mar 20 18:02:00 crc kubenswrapper[4795]: E0320 18:02:00.164307 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfe5ffc-ab15-4277-966f-f506e725e8b1" containerName="keystone-cron"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.164320 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfe5ffc-ab15-4277-966f-f506e725e8b1" containerName="keystone-cron"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.164531 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfe5ffc-ab15-4277-966f-f506e725e8b1" containerName="keystone-cron"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.165303 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-729s4"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.174657 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.175012 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.176110 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.176882 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-729s4"]
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.327536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkt5f\" (UniqueName: \"kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f\") pod \"auto-csr-approver-29567162-729s4\" (UID: \"3f28413f-4baf-4c13-bfaa-dc76fcb80e65\") " pod="openshift-infra/auto-csr-approver-29567162-729s4"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.430492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkt5f\" (UniqueName: \"kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f\") pod \"auto-csr-approver-29567162-729s4\" (UID: \"3f28413f-4baf-4c13-bfaa-dc76fcb80e65\") " pod="openshift-infra/auto-csr-approver-29567162-729s4"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.461842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkt5f\" (UniqueName: \"kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f\") pod \"auto-csr-approver-29567162-729s4\" (UID: \"3f28413f-4baf-4c13-bfaa-dc76fcb80e65\") " pod="openshift-infra/auto-csr-approver-29567162-729s4"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.488248 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-729s4"
Mar 20 18:02:00 crc kubenswrapper[4795]: I0320 18:02:00.985830 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-729s4"]
Mar 20 18:02:01 crc kubenswrapper[4795]: I0320 18:02:01.433974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-729s4" event={"ID":"3f28413f-4baf-4c13-bfaa-dc76fcb80e65","Type":"ContainerStarted","Data":"5e90451ce58885bdbe5b6c79c389be731f2b6a7327ef8c4d734e1350341b943b"}
Mar 20 18:02:02 crc kubenswrapper[4795]: I0320 18:02:02.445221 4795 generic.go:334] "Generic (PLEG): container finished" podID="3f28413f-4baf-4c13-bfaa-dc76fcb80e65" containerID="d974ca7e16de9404ab80c37c963c39d73239baae8fb9a3246b8e5e345f171158" exitCode=0
Mar 20 18:02:02 crc kubenswrapper[4795]: I0320 18:02:02.445321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-729s4" event={"ID":"3f28413f-4baf-4c13-bfaa-dc76fcb80e65","Type":"ContainerDied","Data":"d974ca7e16de9404ab80c37c963c39d73239baae8fb9a3246b8e5e345f171158"}
Mar 20 18:02:03 crc kubenswrapper[4795]: I0320 18:02:03.252466 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 18:02:03 crc kubenswrapper[4795]: E0320 18:02:03.252944 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:02:03 crc kubenswrapper[4795]: I0320 18:02:03.776616 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-729s4"
Mar 20 18:02:03 crc kubenswrapper[4795]: I0320 18:02:03.891929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkt5f\" (UniqueName: \"kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f\") pod \"3f28413f-4baf-4c13-bfaa-dc76fcb80e65\" (UID: \"3f28413f-4baf-4c13-bfaa-dc76fcb80e65\") "
Mar 20 18:02:03 crc kubenswrapper[4795]: I0320 18:02:03.899180 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f" (OuterVolumeSpecName: "kube-api-access-dkt5f") pod "3f28413f-4baf-4c13-bfaa-dc76fcb80e65" (UID: "3f28413f-4baf-4c13-bfaa-dc76fcb80e65"). InnerVolumeSpecName "kube-api-access-dkt5f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:02:03 crc kubenswrapper[4795]: I0320 18:02:03.994761 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkt5f\" (UniqueName: \"kubernetes.io/projected/3f28413f-4baf-4c13-bfaa-dc76fcb80e65-kube-api-access-dkt5f\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:04 crc kubenswrapper[4795]: I0320 18:02:04.462574 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-729s4" event={"ID":"3f28413f-4baf-4c13-bfaa-dc76fcb80e65","Type":"ContainerDied","Data":"5e90451ce58885bdbe5b6c79c389be731f2b6a7327ef8c4d734e1350341b943b"}
Mar 20 18:02:04 crc kubenswrapper[4795]: I0320 18:02:04.462622 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e90451ce58885bdbe5b6c79c389be731f2b6a7327ef8c4d734e1350341b943b"
Mar 20 18:02:04 crc kubenswrapper[4795]: I0320 18:02:04.462676 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-729s4"
Mar 20 18:02:04 crc kubenswrapper[4795]: I0320 18:02:04.840607 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-cp2gz"]
Mar 20 18:02:04 crc kubenswrapper[4795]: I0320 18:02:04.849928 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-cp2gz"]
Mar 20 18:02:05 crc kubenswrapper[4795]: I0320 18:02:05.271359 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b9cffc-8f64-481b-9f51-334e3e04ed7b" path="/var/lib/kubelet/pods/23b9cffc-8f64-481b-9f51-334e3e04ed7b/volumes"
Mar 20 18:02:11 crc kubenswrapper[4795]: I0320 18:02:11.526487 4795 generic.go:334] "Generic (PLEG): container finished" podID="709f5080-c511-4d3b-bc9c-baeec85fa245" containerID="0c4a86a4e0bd9983c0776f522d0ac2f129ecc0ae67cd8e75fe2c36c5fe922436" exitCode=0
Mar 20 18:02:11 crc kubenswrapper[4795]: I0320 18:02:11.526582 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" event={"ID":"709f5080-c511-4d3b-bc9c-baeec85fa245","Type":"ContainerDied","Data":"0c4a86a4e0bd9983c0776f522d0ac2f129ecc0ae67cd8e75fe2c36c5fe922436"}
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.074185 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181461 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") "
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181594 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") "
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181652 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") "
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181717 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nl9n\" (UniqueName: \"kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") "
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") "
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181811 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") "
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181890 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") "
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.181922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") "
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.182084 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") "
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.182217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") "
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.182276 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0\") pod \"709f5080-c511-4d3b-bc9c-baeec85fa245\" (UID: \"709f5080-c511-4d3b-bc9c-baeec85fa245\") "
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.193039 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.204609 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n" (OuterVolumeSpecName: "kube-api-access-7nl9n") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "kube-api-access-7nl9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.217170 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.217174 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.217818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.224346 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.224586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.230303 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory" (OuterVolumeSpecName: "inventory") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.234237 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.238589 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.243001 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "709f5080-c511-4d3b-bc9c-baeec85fa245" (UID: "709f5080-c511-4d3b-bc9c-baeec85fa245"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.287494 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.287801 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.287822 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.287840 4795 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.287857 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288000 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nl9n\" (UniqueName: \"kubernetes.io/projected/709f5080-c511-4d3b-bc9c-baeec85fa245-kube-api-access-7nl9n\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288072 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288130 4795 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288308 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288337 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.288358 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/709f5080-c511-4d3b-bc9c-baeec85fa245-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.550017 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx" event={"ID":"709f5080-c511-4d3b-bc9c-baeec85fa245","Type":"ContainerDied","Data":"e18c048a597dc4d9d215e2e77b0cde124882e9a2ee7a96faf5c2c8ceff8b067d"}
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.550071 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e18c048a597dc4d9d215e2e77b0cde124882e9a2ee7a96faf5c2c8ceff8b067d"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.550126 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kc4wx"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.816244 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"]
Mar 20 18:02:13 crc kubenswrapper[4795]: E0320 18:02:13.831840 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709f5080-c511-4d3b-bc9c-baeec85fa245" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.832057 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="709f5080-c511-4d3b-bc9c-baeec85fa245" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 18:02:13 crc kubenswrapper[4795]: E0320 18:02:13.832131 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f28413f-4baf-4c13-bfaa-dc76fcb80e65" containerName="oc"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.832194 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f28413f-4baf-4c13-bfaa-dc76fcb80e65" containerName="oc"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.832445 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f28413f-4baf-4c13-bfaa-dc76fcb80e65" containerName="oc"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.832535 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="709f5080-c511-4d3b-bc9c-baeec85fa245" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.833205 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.835477 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.836082 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.836097 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.836249 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xztkc"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.836329 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.844156 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"]
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.905247 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.906859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.906997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.907117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndcs8\" (UniqueName: \"kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.909849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.909953 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:13 crc kubenswrapper[4795]: I0320 18:02:13.910066 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.010712 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.010974 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndcs8\" (UniqueName: \"kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.011081 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.011180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.011302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.011391 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.011476 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.017270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.019183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.019639 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.020243 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.020641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.033132 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.033939 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndcs8\" (UniqueName: \"kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.165576 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"
Mar 20 18:02:14 crc kubenswrapper[4795]: I0320 18:02:14.685149 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh"]
Mar 20 18:02:15 crc kubenswrapper[4795]: I0320 18:02:15.272295 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f"
Mar 20 18:02:15 crc kubenswrapper[4795]: I0320 18:02:15.577159 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" event={"ID":"d519d04c-89f1-46b7-8136-1a9596af73ac","Type":"ContainerStarted","Data":"01bbbaa61b45180cf8109cf1e889d03b1b5c7df633fdfee4f8bb470cfabf15d5"}
Mar 20 18:02:16 crc kubenswrapper[4795]: I0320 18:02:16.590274 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb"}
Mar 20 18:02:16 crc kubenswrapper[4795]: I0320 18:02:16.592320 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" event={"ID":"d519d04c-89f1-46b7-8136-1a9596af73ac","Type":"ContainerStarted","Data":"82decdc91bfadfc70aba4b8aca3b29297f575d87a05741abcadb41a86f6b2aa7"}
Mar 20 18:02:55 crc kubenswrapper[4795]: I0320 18:02:55.103740 4795 scope.go:117] "RemoveContainer" containerID="025d16245a433259b825961c6fb9d8ed0412608aa4b43ab349fe67ca35e229a7"
Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.152471 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" podStartSLOduration=106.418763605 podStartE2EDuration="1m47.152452441s" podCreationTimestamp="2026-03-20 18:02:13 +0000 UTC" firstStartedPulling="2026-03-20 18:02:14.684504769 +0000 UTC m=+2678.142536330" lastFinishedPulling="2026-03-20 18:02:15.418193615 +0000 UTC m=+2678.876225166" observedRunningTime="2026-03-20 18:02:16.645792465 +0000 UTC m=+2680.103824016" watchObservedRunningTime="2026-03-20 18:04:00.152452441 +0000 UTC m=+2783.610483982"
Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.155758 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567164-j2kpr"]
Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.157113 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-j2kpr"
Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.159872 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.160170 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.161376 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.189957 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-j2kpr"]
Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.215841 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whjk5\" (UniqueName: \"kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5\") pod \"auto-csr-approver-29567164-j2kpr\" (UID: \"42c32f39-7999-4aa6-be6f-bdfc11c61cf8\") " pod="openshift-infra/auto-csr-approver-29567164-j2kpr"
Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.317930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whjk5\"
(UniqueName: \"kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5\") pod \"auto-csr-approver-29567164-j2kpr\" (UID: \"42c32f39-7999-4aa6-be6f-bdfc11c61cf8\") " pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.339264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whjk5\" (UniqueName: \"kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5\") pod \"auto-csr-approver-29567164-j2kpr\" (UID: \"42c32f39-7999-4aa6-be6f-bdfc11c61cf8\") " pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.486424 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:00 crc kubenswrapper[4795]: I0320 18:04:00.961511 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-j2kpr"] Mar 20 18:04:01 crc kubenswrapper[4795]: I0320 18:04:01.716445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" event={"ID":"42c32f39-7999-4aa6-be6f-bdfc11c61cf8","Type":"ContainerStarted","Data":"e1ed5eb4e516f4be735785fac8f3f2e20c5ee1c9acfa50fb7f7a2c53e1b37596"} Mar 20 18:04:02 crc kubenswrapper[4795]: I0320 18:04:02.730154 4795 generic.go:334] "Generic (PLEG): container finished" podID="42c32f39-7999-4aa6-be6f-bdfc11c61cf8" containerID="b5132a3bc3cc2a9d3f7f72ee052f22ef62d6486d39d7ff63bd2d4ca8c43eb377" exitCode=0 Mar 20 18:04:02 crc kubenswrapper[4795]: I0320 18:04:02.730356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" event={"ID":"42c32f39-7999-4aa6-be6f-bdfc11c61cf8","Type":"ContainerDied","Data":"b5132a3bc3cc2a9d3f7f72ee052f22ef62d6486d39d7ff63bd2d4ca8c43eb377"} Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.125964 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.192561 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whjk5\" (UniqueName: \"kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5\") pod \"42c32f39-7999-4aa6-be6f-bdfc11c61cf8\" (UID: \"42c32f39-7999-4aa6-be6f-bdfc11c61cf8\") " Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.199974 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5" (OuterVolumeSpecName: "kube-api-access-whjk5") pod "42c32f39-7999-4aa6-be6f-bdfc11c61cf8" (UID: "42c32f39-7999-4aa6-be6f-bdfc11c61cf8"). InnerVolumeSpecName "kube-api-access-whjk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.294160 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whjk5\" (UniqueName: \"kubernetes.io/projected/42c32f39-7999-4aa6-be6f-bdfc11c61cf8-kube-api-access-whjk5\") on node \"crc\" DevicePath \"\"" Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.755603 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" event={"ID":"42c32f39-7999-4aa6-be6f-bdfc11c61cf8","Type":"ContainerDied","Data":"e1ed5eb4e516f4be735785fac8f3f2e20c5ee1c9acfa50fb7f7a2c53e1b37596"} Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.755639 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ed5eb4e516f4be735785fac8f3f2e20c5ee1c9acfa50fb7f7a2c53e1b37596" Mar 20 18:04:04 crc kubenswrapper[4795]: I0320 18:04:04.755798 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-j2kpr" Mar 20 18:04:05 crc kubenswrapper[4795]: I0320 18:04:05.230808 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-45rzl"] Mar 20 18:04:05 crc kubenswrapper[4795]: I0320 18:04:05.239160 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-45rzl"] Mar 20 18:04:05 crc kubenswrapper[4795]: I0320 18:04:05.261014 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed" path="/var/lib/kubelet/pods/08fd2b90-3d66-4e64-bbb0-c4eaf75e0aed/volumes" Mar 20 18:04:41 crc kubenswrapper[4795]: I0320 18:04:41.300581 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:04:41 crc kubenswrapper[4795]: I0320 18:04:41.302382 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:04:55 crc kubenswrapper[4795]: I0320 18:04:55.227729 4795 scope.go:117] "RemoveContainer" containerID="62ed101e828326cc7fffe40dc572b4d86ec23a0ff89623436e14e45075fbfa9a" Mar 20 18:05:11 crc kubenswrapper[4795]: I0320 18:05:11.300042 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:05:11 crc kubenswrapper[4795]: 
I0320 18:05:11.300902 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:05:36 crc kubenswrapper[4795]: I0320 18:05:36.711026 4795 generic.go:334] "Generic (PLEG): container finished" podID="d519d04c-89f1-46b7-8136-1a9596af73ac" containerID="82decdc91bfadfc70aba4b8aca3b29297f575d87a05741abcadb41a86f6b2aa7" exitCode=0 Mar 20 18:05:36 crc kubenswrapper[4795]: I0320 18:05:36.711113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" event={"ID":"d519d04c-89f1-46b7-8136-1a9596af73ac","Type":"ContainerDied","Data":"82decdc91bfadfc70aba4b8aca3b29297f575d87a05741abcadb41a86f6b2aa7"} Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.195383 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324160 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324269 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324314 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndcs8\" (UniqueName: \"kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324735 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 
18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324776 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.324850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2\") pod \"d519d04c-89f1-46b7-8136-1a9596af73ac\" (UID: \"d519d04c-89f1-46b7-8136-1a9596af73ac\") " Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.332051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.332932 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8" (OuterVolumeSpecName: "kube-api-access-ndcs8") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "kube-api-access-ndcs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.357649 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.366999 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory" (OuterVolumeSpecName: "inventory") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.369948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.375110 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.382579 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d519d04c-89f1-46b7-8136-1a9596af73ac" (UID: "d519d04c-89f1-46b7-8136-1a9596af73ac"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428193 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndcs8\" (UniqueName: \"kubernetes.io/projected/d519d04c-89f1-46b7-8136-1a9596af73ac-kube-api-access-ndcs8\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428419 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428462 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428476 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428490 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc 
kubenswrapper[4795]: I0320 18:05:38.428504 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.428517 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d519d04c-89f1-46b7-8136-1a9596af73ac-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.736342 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" event={"ID":"d519d04c-89f1-46b7-8136-1a9596af73ac","Type":"ContainerDied","Data":"01bbbaa61b45180cf8109cf1e889d03b1b5c7df633fdfee4f8bb470cfabf15d5"} Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.736747 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01bbbaa61b45180cf8109cf1e889d03b1b5c7df633fdfee4f8bb470cfabf15d5" Mar 20 18:05:38 crc kubenswrapper[4795]: I0320 18:05:38.736428 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh" Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.299815 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.302525 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.302774 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.304183 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.304474 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb" gracePeriod=600 Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.767429 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb" exitCode=0 Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.767506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb"} Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.767832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"} Mar 20 18:05:41 crc kubenswrapper[4795]: I0320 18:05:41.767859 4795 scope.go:117] "RemoveContainer" containerID="3c8a78df11b326a5d2a73551ffeb3690ad334d76561f9a18fdee417f3e87487f" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.030670 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:00 crc kubenswrapper[4795]: E0320 18:06:00.039306 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c32f39-7999-4aa6-be6f-bdfc11c61cf8" containerName="oc" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.039479 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c32f39-7999-4aa6-be6f-bdfc11c61cf8" containerName="oc" Mar 20 18:06:00 crc kubenswrapper[4795]: E0320 18:06:00.039584 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d519d04c-89f1-46b7-8136-1a9596af73ac" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.039723 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d519d04c-89f1-46b7-8136-1a9596af73ac" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 
18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.040043 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c32f39-7999-4aa6-be6f-bdfc11c61cf8" containerName="oc" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.040173 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d519d04c-89f1-46b7-8136-1a9596af73ac" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.041867 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.054553 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.164586 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567166-n9tld"] Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.166527 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.170369 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.171122 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.171303 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.174509 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-n9tld"] Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.209727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbfnj\" (UniqueName: \"kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.209784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.209878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " 
pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.311387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.311786 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgbt2\" (UniqueName: \"kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2\") pod \"auto-csr-approver-29567166-n9tld\" (UID: \"26125bad-5b31-4c3d-901b-758cb842af78\") " pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.311931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbfnj\" (UniqueName: \"kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.312010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.312087 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content\") pod \"certified-operators-cgvrw\" (UID: 
\"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.312396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.332285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbfnj\" (UniqueName: \"kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj\") pod \"certified-operators-cgvrw\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.386764 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.413792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgbt2\" (UniqueName: \"kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2\") pod \"auto-csr-approver-29567166-n9tld\" (UID: \"26125bad-5b31-4c3d-901b-758cb842af78\") " pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.438962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgbt2\" (UniqueName: \"kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2\") pod \"auto-csr-approver-29567166-n9tld\" (UID: \"26125bad-5b31-4c3d-901b-758cb842af78\") " pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.495058 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.876383 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:00 crc kubenswrapper[4795]: W0320 18:06:00.879234 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod709ba7cf_f8f7_4741_a050_f10234db1ff3.slice/crio-c8d8fc1c89571c56bda6be901cac9b09bc4690f68e10c0adc1669ccfe50eb5ec WatchSource:0}: Error finding container c8d8fc1c89571c56bda6be901cac9b09bc4690f68e10c0adc1669ccfe50eb5ec: Status 404 returned error can't find the container with id c8d8fc1c89571c56bda6be901cac9b09bc4690f68e10c0adc1669ccfe50eb5ec Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.988061 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-n9tld"] Mar 20 18:06:00 crc kubenswrapper[4795]: W0320 18:06:00.990280 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26125bad_5b31_4c3d_901b_758cb842af78.slice/crio-89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083 WatchSource:0}: Error finding container 89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083: Status 404 returned error can't find the container with id 89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083 Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.991854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerStarted","Data":"c8d8fc1c89571c56bda6be901cac9b09bc4690f68e10c0adc1669ccfe50eb5ec"} Mar 20 18:06:00 crc kubenswrapper[4795]: I0320 18:06:00.992838 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 
20 18:06:02 crc kubenswrapper[4795]: I0320 18:06:02.005487 4795 generic.go:334] "Generic (PLEG): container finished" podID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerID="f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d" exitCode=0 Mar 20 18:06:02 crc kubenswrapper[4795]: I0320 18:06:02.005552 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerDied","Data":"f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d"} Mar 20 18:06:02 crc kubenswrapper[4795]: I0320 18:06:02.008241 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-n9tld" event={"ID":"26125bad-5b31-4c3d-901b-758cb842af78","Type":"ContainerStarted","Data":"89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083"} Mar 20 18:06:03 crc kubenswrapper[4795]: I0320 18:06:03.019216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerStarted","Data":"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a"} Mar 20 18:06:03 crc kubenswrapper[4795]: I0320 18:06:03.021705 4795 generic.go:334] "Generic (PLEG): container finished" podID="26125bad-5b31-4c3d-901b-758cb842af78" containerID="388b030bf3690b9a4b1c0fb962fdb64c5a18c01bd7022e209c55231177d61b95" exitCode=0 Mar 20 18:06:03 crc kubenswrapper[4795]: I0320 18:06:03.021798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-n9tld" event={"ID":"26125bad-5b31-4c3d-901b-758cb842af78","Type":"ContainerDied","Data":"388b030bf3690b9a4b1c0fb962fdb64c5a18c01bd7022e209c55231177d61b95"} Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.033799 4795 generic.go:334] "Generic (PLEG): container finished" podID="709ba7cf-f8f7-4741-a050-f10234db1ff3" 
containerID="8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a" exitCode=0 Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.034036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerDied","Data":"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a"} Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.431965 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.593957 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgbt2\" (UniqueName: \"kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2\") pod \"26125bad-5b31-4c3d-901b-758cb842af78\" (UID: \"26125bad-5b31-4c3d-901b-758cb842af78\") " Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.600758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2" (OuterVolumeSpecName: "kube-api-access-bgbt2") pod "26125bad-5b31-4c3d-901b-758cb842af78" (UID: "26125bad-5b31-4c3d-901b-758cb842af78"). InnerVolumeSpecName "kube-api-access-bgbt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:04 crc kubenswrapper[4795]: I0320 18:06:04.696521 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgbt2\" (UniqueName: \"kubernetes.io/projected/26125bad-5b31-4c3d-901b-758cb842af78-kube-api-access-bgbt2\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.047750 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-n9tld" Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.051235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-n9tld" event={"ID":"26125bad-5b31-4c3d-901b-758cb842af78","Type":"ContainerDied","Data":"89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083"} Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.051427 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89c64460b21a33197d7aff9f7fb079139946cae58782c39f9fa5bb1e3d7e1083" Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.055389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerStarted","Data":"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379"} Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.084891 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cgvrw" podStartSLOduration=3.667673141 podStartE2EDuration="6.084870271s" podCreationTimestamp="2026-03-20 18:05:59 +0000 UTC" firstStartedPulling="2026-03-20 18:06:02.008994219 +0000 UTC m=+2905.467025770" lastFinishedPulling="2026-03-20 18:06:04.426191359 +0000 UTC m=+2907.884222900" observedRunningTime="2026-03-20 18:06:05.078137452 +0000 UTC m=+2908.536169043" watchObservedRunningTime="2026-03-20 18:06:05.084870271 +0000 UTC m=+2908.542901812" Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.507413 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-k8xrk"] Mar 20 18:06:05 crc kubenswrapper[4795]: I0320 18:06:05.514124 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-k8xrk"] Mar 20 18:06:07 crc kubenswrapper[4795]: I0320 18:06:07.282271 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01c5ccc-0cea-415d-969d-64f17a21036b" path="/var/lib/kubelet/pods/a01c5ccc-0cea-415d-969d-64f17a21036b/volumes" Mar 20 18:06:10 crc kubenswrapper[4795]: I0320 18:06:10.386967 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:10 crc kubenswrapper[4795]: I0320 18:06:10.387477 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:10 crc kubenswrapper[4795]: I0320 18:06:10.475124 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:11 crc kubenswrapper[4795]: I0320 18:06:11.189355 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:11 crc kubenswrapper[4795]: I0320 18:06:11.274798 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.145300 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cgvrw" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="registry-server" containerID="cri-o://ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379" gracePeriod=2 Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.706741 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.802164 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content\") pod \"709ba7cf-f8f7-4741-a050-f10234db1ff3\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.802285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbfnj\" (UniqueName: \"kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj\") pod \"709ba7cf-f8f7-4741-a050-f10234db1ff3\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.802413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities\") pod \"709ba7cf-f8f7-4741-a050-f10234db1ff3\" (UID: \"709ba7cf-f8f7-4741-a050-f10234db1ff3\") " Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.803544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities" (OuterVolumeSpecName: "utilities") pod "709ba7cf-f8f7-4741-a050-f10234db1ff3" (UID: "709ba7cf-f8f7-4741-a050-f10234db1ff3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.809852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj" (OuterVolumeSpecName: "kube-api-access-xbfnj") pod "709ba7cf-f8f7-4741-a050-f10234db1ff3" (UID: "709ba7cf-f8f7-4741-a050-f10234db1ff3"). InnerVolumeSpecName "kube-api-access-xbfnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.851225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "709ba7cf-f8f7-4741-a050-f10234db1ff3" (UID: "709ba7cf-f8f7-4741-a050-f10234db1ff3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.904643 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.904681 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/709ba7cf-f8f7-4741-a050-f10234db1ff3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:13 crc kubenswrapper[4795]: I0320 18:06:13.904701 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbfnj\" (UniqueName: \"kubernetes.io/projected/709ba7cf-f8f7-4741-a050-f10234db1ff3-kube-api-access-xbfnj\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.159024 4795 generic.go:334] "Generic (PLEG): container finished" podID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerID="ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379" exitCode=0 Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.159086 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerDied","Data":"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379"} Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.159421 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-cgvrw" event={"ID":"709ba7cf-f8f7-4741-a050-f10234db1ff3","Type":"ContainerDied","Data":"c8d8fc1c89571c56bda6be901cac9b09bc4690f68e10c0adc1669ccfe50eb5ec"} Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.159446 4795 scope.go:117] "RemoveContainer" containerID="ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.159169 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cgvrw" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.200310 4795 scope.go:117] "RemoveContainer" containerID="8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.203028 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.214462 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cgvrw"] Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.235394 4795 scope.go:117] "RemoveContainer" containerID="f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.263819 4795 scope.go:117] "RemoveContainer" containerID="ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379" Mar 20 18:06:14 crc kubenswrapper[4795]: E0320 18:06:14.264297 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379\": container with ID starting with ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379 not found: ID does not exist" containerID="ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 
18:06:14.264340 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379"} err="failed to get container status \"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379\": rpc error: code = NotFound desc = could not find container \"ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379\": container with ID starting with ffd3971791d62c334fc1d0abffaf3516d233afb56dd0c952ff1b71bc20bc0379 not found: ID does not exist" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.264366 4795 scope.go:117] "RemoveContainer" containerID="8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a" Mar 20 18:06:14 crc kubenswrapper[4795]: E0320 18:06:14.264882 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a\": container with ID starting with 8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a not found: ID does not exist" containerID="8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.264917 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a"} err="failed to get container status \"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a\": rpc error: code = NotFound desc = could not find container \"8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a\": container with ID starting with 8f973bba24da7cebd92a4e5e318776e9ca5cc94902eb0985fbdd5ad4de07fb5a not found: ID does not exist" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.264943 4795 scope.go:117] "RemoveContainer" containerID="f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d" Mar 20 18:06:14 crc 
kubenswrapper[4795]: E0320 18:06:14.265530 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d\": container with ID starting with f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d not found: ID does not exist" containerID="f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d" Mar 20 18:06:14 crc kubenswrapper[4795]: I0320 18:06:14.265564 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d"} err="failed to get container status \"f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d\": rpc error: code = NotFound desc = could not find container \"f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d\": container with ID starting with f810cd1a41dcb66d06ac7d5edb0a2c0ba26a098dd09da2302ce1c2105fc3b65d not found: ID does not exist" Mar 20 18:06:15 crc kubenswrapper[4795]: I0320 18:06:15.264322 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" path="/var/lib/kubelet/pods/709ba7cf-f8f7-4741-a050-f10234db1ff3/volumes" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.139217 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:06:35 crc kubenswrapper[4795]: E0320 18:06:35.140383 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="extract-content" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140404 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="extract-content" Mar 20 18:06:35 crc kubenswrapper[4795]: E0320 18:06:35.140440 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="registry-server" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140453 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="registry-server" Mar 20 18:06:35 crc kubenswrapper[4795]: E0320 18:06:35.140468 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="extract-utilities" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140481 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="extract-utilities" Mar 20 18:06:35 crc kubenswrapper[4795]: E0320 18:06:35.140530 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26125bad-5b31-4c3d-901b-758cb842af78" containerName="oc" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140543 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="26125bad-5b31-4c3d-901b-758cb842af78" containerName="oc" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140910 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="709ba7cf-f8f7-4741-a050-f10234db1ff3" containerName="registry-server" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.140936 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="26125bad-5b31-4c3d-901b-758cb842af78" containerName="oc" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.141900 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.146009 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.147151 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.148583 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-zgwjr" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.148982 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.167748 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.233899 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.233957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.233981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret\") 
pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234138 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w88sw\" (UniqueName: \"kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234183 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234368 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234433 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234620 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.234882 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337219 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337407 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs\") pod \"tempest-tests-tempest\" (UID: 
\"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337493 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w88sw\" (UniqueName: \"kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337513 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.337598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " 
pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.338045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.338105 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.338562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.339105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.339189 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc 
kubenswrapper[4795]: I0320 18:06:35.345481 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.345601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.348003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.360034 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w88sw\" (UniqueName: \"kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.369246 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.472028 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:06:35 crc kubenswrapper[4795]: I0320 18:06:35.938016 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:06:36 crc kubenswrapper[4795]: I0320 18:06:36.406115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"caaf60a5-8c45-4831-8d26-8cf808f1da7a","Type":"ContainerStarted","Data":"7bdfe7f881d951a74ca8b66b0f91841dff449bc239ef1c2b7c679ee61596377d"} Mar 20 18:06:55 crc kubenswrapper[4795]: I0320 18:06:55.365637 4795 scope.go:117] "RemoveContainer" containerID="476381efe19b63607c82b31ec27f9dd8365ccb9f0dd4f0e61aba026eba68e6a2" Mar 20 18:07:10 crc kubenswrapper[4795]: E0320 18:07:10.532121 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 20 18:07:10 crc kubenswrapper[4795]: E0320 18:07:10.532597 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w88sw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(caaf60a5-8c45-4831-8d26-8cf808f1da7a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 18:07:10 crc kubenswrapper[4795]: E0320 18:07:10.533735 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="caaf60a5-8c45-4831-8d26-8cf808f1da7a"
Mar 20 18:07:10 crc kubenswrapper[4795]: E0320 18:07:10.799530 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="caaf60a5-8c45-4831-8d26-8cf808f1da7a"
Mar 20 18:07:12 crc kubenswrapper[4795]: I0320 18:07:12.996251 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"]
Mar 20 18:07:12 crc kubenswrapper[4795]: I0320 18:07:12.999753 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.007520 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"]
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.117489 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.117555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.118030 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hszrb\" (UniqueName: \"kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.219929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.220128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hszrb\" (UniqueName: \"kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.220191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.220653 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.220852 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.254723 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hszrb\" (UniqueName: \"kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb\") pod \"redhat-marketplace-dbgpt\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") " pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.324419 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.784114 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"]
Mar 20 18:07:13 crc kubenswrapper[4795]: I0320 18:07:13.830999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerStarted","Data":"49fa4e3d814862c4a24af506dc4bf3906d72f9f6d200a77b7c6698cc84d8440d"}
Mar 20 18:07:14 crc kubenswrapper[4795]: I0320 18:07:14.841672 4795 generic.go:334] "Generic (PLEG): container finished" podID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerID="62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d" exitCode=0
Mar 20 18:07:14 crc kubenswrapper[4795]: I0320 18:07:14.841810 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerDied","Data":"62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d"}
Mar 20 18:07:15 crc kubenswrapper[4795]: I0320 18:07:15.871771 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerStarted","Data":"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0"}
Mar 20 18:07:16 crc kubenswrapper[4795]: I0320 18:07:16.883082 4795 generic.go:334] "Generic (PLEG): container finished" podID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerID="ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0" exitCode=0
Mar 20 18:07:16 crc kubenswrapper[4795]: I0320 18:07:16.883134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerDied","Data":"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0"}
Mar 20 18:07:17 crc kubenswrapper[4795]: I0320 18:07:17.896549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerStarted","Data":"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c"}
Mar 20 18:07:17 crc kubenswrapper[4795]: I0320 18:07:17.919525 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dbgpt" podStartSLOduration=3.505611374 podStartE2EDuration="5.919505261s" podCreationTimestamp="2026-03-20 18:07:12 +0000 UTC" firstStartedPulling="2026-03-20 18:07:14.843548617 +0000 UTC m=+2978.301580158" lastFinishedPulling="2026-03-20 18:07:17.257442484 +0000 UTC m=+2980.715474045" observedRunningTime="2026-03-20 18:07:17.915860398 +0000 UTC m=+2981.373891949" watchObservedRunningTime="2026-03-20 18:07:17.919505261 +0000 UTC m=+2981.377536822"
Mar 20 18:07:22 crc kubenswrapper[4795]: I0320 18:07:22.730085 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Mar 20 18:07:23 crc kubenswrapper[4795]: I0320 18:07:23.325233 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:23 crc kubenswrapper[4795]: I0320 18:07:23.325581 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:23 crc kubenswrapper[4795]: I0320 18:07:23.374296 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:23 crc kubenswrapper[4795]: I0320 18:07:23.955540 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"caaf60a5-8c45-4831-8d26-8cf808f1da7a","Type":"ContainerStarted","Data":"ca1e86805a9f6b3f6807f721075e3f792e3f51254780ac13719c7eec007f4373"}
Mar 20 18:07:23 crc kubenswrapper[4795]: I0320 18:07:23.987837 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.195969788 podStartE2EDuration="49.987814711s" podCreationTimestamp="2026-03-20 18:06:34 +0000 UTC" firstStartedPulling="2026-03-20 18:06:35.934071526 +0000 UTC m=+2939.392103067" lastFinishedPulling="2026-03-20 18:07:22.725916439 +0000 UTC m=+2986.183947990" observedRunningTime="2026-03-20 18:07:23.978262814 +0000 UTC m=+2987.436294365" watchObservedRunningTime="2026-03-20 18:07:23.987814711 +0000 UTC m=+2987.445846262"
Mar 20 18:07:24 crc kubenswrapper[4795]: I0320 18:07:24.033520 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:24 crc kubenswrapper[4795]: I0320 18:07:24.093506 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"]
Mar 20 18:07:25 crc kubenswrapper[4795]: I0320 18:07:25.988639 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dbgpt" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="registry-server" containerID="cri-o://623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c" gracePeriod=2
Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.456353 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.607206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities\") pod \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") "
Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.607346 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content\") pod \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") "
Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.607393 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hszrb\" (UniqueName: \"kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb\") pod \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\" (UID: \"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7\") "
Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.608795 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities" (OuterVolumeSpecName: "utilities") pod "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" (UID: "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.614236 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb" (OuterVolumeSpecName: "kube-api-access-hszrb") pod "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" (UID: "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7"). InnerVolumeSpecName "kube-api-access-hszrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.709682 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hszrb\" (UniqueName: \"kubernetes.io/projected/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-kube-api-access-hszrb\") on node \"crc\" DevicePath \"\""
Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.709736 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.753597 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" (UID: "95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:07:26 crc kubenswrapper[4795]: I0320 18:07:26.811824 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.009657 4795 generic.go:334] "Generic (PLEG): container finished" podID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerID="623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c" exitCode=0
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.009743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerDied","Data":"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c"}
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.010227 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dbgpt" event={"ID":"95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7","Type":"ContainerDied","Data":"49fa4e3d814862c4a24af506dc4bf3906d72f9f6d200a77b7c6698cc84d8440d"}
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.009778 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dbgpt"
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.010272 4795 scope.go:117] "RemoveContainer" containerID="623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c"
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.042203 4795 scope.go:117] "RemoveContainer" containerID="ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0"
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.062436 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"]
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.074188 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dbgpt"]
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.086370 4795 scope.go:117] "RemoveContainer" containerID="62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d"
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.125671 4795 scope.go:117] "RemoveContainer" containerID="623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c"
Mar 20 18:07:27 crc kubenswrapper[4795]: E0320 18:07:27.126753 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c\": container with ID starting with 623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c not found: ID does not exist" containerID="623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c"
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.126793 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c"} err="failed to get container status \"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c\": rpc error: code = NotFound desc = could not find container \"623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c\": container with ID starting with 623bbd79a6c560d6a589af40dcd4c9833e4061cfa3deaffdb4fe0296eb3aa29c not found: ID does not exist"
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.126816 4795 scope.go:117] "RemoveContainer" containerID="ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0"
Mar 20 18:07:27 crc kubenswrapper[4795]: E0320 18:07:27.127588 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0\": container with ID starting with ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0 not found: ID does not exist" containerID="ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0"
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.127616 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0"} err="failed to get container status \"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0\": rpc error: code = NotFound desc = could not find container \"ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0\": container with ID starting with ce34f41a8edebe6d46960ad694ac00d1005aa75227c68c8b9eb12ce6ec47e9d0 not found: ID does not exist"
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.127636 4795 scope.go:117] "RemoveContainer" containerID="62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d"
Mar 20 18:07:27 crc kubenswrapper[4795]: E0320 18:07:27.128037 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d\": container with ID starting with 62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d not found: ID does not exist" containerID="62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d"
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.128061 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d"} err="failed to get container status \"62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d\": rpc error: code = NotFound desc = could not find container \"62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d\": container with ID starting with 62531065b97297a2618689e8c915d43d7630541862a57218594d570e25918f8d not found: ID does not exist"
Mar 20 18:07:27 crc kubenswrapper[4795]: I0320 18:07:27.263985 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" path="/var/lib/kubelet/pods/95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7/volumes"
Mar 20 18:07:41 crc kubenswrapper[4795]: I0320 18:07:41.299854 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 18:07:41 crc kubenswrapper[4795]: I0320 18:07:41.300582 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.146417 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567168-m7dd2"]
Mar 20 18:08:00 crc kubenswrapper[4795]: E0320 18:08:00.147185 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="extract-content"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.147198 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="extract-content"
Mar 20 18:08:00 crc kubenswrapper[4795]: E0320 18:08:00.147207 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="extract-utilities"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.147213 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="extract-utilities"
Mar 20 18:08:00 crc kubenswrapper[4795]: E0320 18:08:00.147228 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="registry-server"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.147234 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="registry-server"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.147412 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f2cc7d-b289-4df6-b6ba-578ffb1e5cb7" containerName="registry-server"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.147966 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.155200 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.155314 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.156129 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.167461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-m7dd2"]
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.226827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxm6h\" (UniqueName: \"kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h\") pod \"auto-csr-approver-29567168-m7dd2\" (UID: \"47914a4d-df4b-443d-b7f4-b30bfe9e7a98\") " pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.328731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxm6h\" (UniqueName: \"kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h\") pod \"auto-csr-approver-29567168-m7dd2\" (UID: \"47914a4d-df4b-443d-b7f4-b30bfe9e7a98\") " pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.352272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxm6h\" (UniqueName: \"kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h\") pod \"auto-csr-approver-29567168-m7dd2\" (UID: \"47914a4d-df4b-443d-b7f4-b30bfe9e7a98\") " pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:00 crc kubenswrapper[4795]: I0320 18:08:00.483638 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:01 crc kubenswrapper[4795]: I0320 18:08:01.030066 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-m7dd2"]
Mar 20 18:08:01 crc kubenswrapper[4795]: I0320 18:08:01.386856 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-m7dd2" event={"ID":"47914a4d-df4b-443d-b7f4-b30bfe9e7a98","Type":"ContainerStarted","Data":"5a4331e63748a9f9682d25e28b01e564fd5479e87b7c2fd8660f5fc91650c4f0"}
Mar 20 18:08:02 crc kubenswrapper[4795]: I0320 18:08:02.399053 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-m7dd2" event={"ID":"47914a4d-df4b-443d-b7f4-b30bfe9e7a98","Type":"ContainerStarted","Data":"bf867a6e7d6c5ea2dcb1d75b1399a5354375f53c845fe9a30c34fd725a0eba5f"}
Mar 20 18:08:02 crc kubenswrapper[4795]: I0320 18:08:02.418381 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567168-m7dd2" podStartSLOduration=1.518303001 podStartE2EDuration="2.418366195s" podCreationTimestamp="2026-03-20 18:08:00 +0000 UTC" firstStartedPulling="2026-03-20 18:08:01.025501274 +0000 UTC m=+3024.483532845" lastFinishedPulling="2026-03-20 18:08:01.925564488 +0000 UTC m=+3025.383596039" observedRunningTime="2026-03-20 18:08:02.414850446 +0000 UTC m=+3025.872881987" watchObservedRunningTime="2026-03-20 18:08:02.418366195 +0000 UTC m=+3025.876397726"
Mar 20 18:08:03 crc kubenswrapper[4795]: I0320 18:08:03.410345 4795 generic.go:334] "Generic (PLEG): container finished" podID="47914a4d-df4b-443d-b7f4-b30bfe9e7a98" containerID="bf867a6e7d6c5ea2dcb1d75b1399a5354375f53c845fe9a30c34fd725a0eba5f" exitCode=0
Mar 20 18:08:03 crc kubenswrapper[4795]: I0320 18:08:03.410545 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-m7dd2" event={"ID":"47914a4d-df4b-443d-b7f4-b30bfe9e7a98","Type":"ContainerDied","Data":"bf867a6e7d6c5ea2dcb1d75b1399a5354375f53c845fe9a30c34fd725a0eba5f"}
Mar 20 18:08:04 crc kubenswrapper[4795]: I0320 18:08:04.893967 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.021574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxm6h\" (UniqueName: \"kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h\") pod \"47914a4d-df4b-443d-b7f4-b30bfe9e7a98\" (UID: \"47914a4d-df4b-443d-b7f4-b30bfe9e7a98\") "
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.027787 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h" (OuterVolumeSpecName: "kube-api-access-qxm6h") pod "47914a4d-df4b-443d-b7f4-b30bfe9e7a98" (UID: "47914a4d-df4b-443d-b7f4-b30bfe9e7a98"). InnerVolumeSpecName "kube-api-access-qxm6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.123508 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxm6h\" (UniqueName: \"kubernetes.io/projected/47914a4d-df4b-443d-b7f4-b30bfe9e7a98-kube-api-access-qxm6h\") on node \"crc\" DevicePath \"\""
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.429984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-m7dd2" event={"ID":"47914a4d-df4b-443d-b7f4-b30bfe9e7a98","Type":"ContainerDied","Data":"5a4331e63748a9f9682d25e28b01e564fd5479e87b7c2fd8660f5fc91650c4f0"}
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.430020 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a4331e63748a9f9682d25e28b01e564fd5479e87b7c2fd8660f5fc91650c4f0"
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.430028 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-m7dd2"
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.488565 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-729s4"]
Mar 20 18:08:05 crc kubenswrapper[4795]: I0320 18:08:05.494998 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-729s4"]
Mar 20 18:08:07 crc kubenswrapper[4795]: I0320 18:08:07.263041 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f28413f-4baf-4c13-bfaa-dc76fcb80e65" path="/var/lib/kubelet/pods/3f28413f-4baf-4c13-bfaa-dc76fcb80e65/volumes"
Mar 20 18:08:10 crc kubenswrapper[4795]: I0320 18:08:10.501893 4795 scope.go:117] "RemoveContainer" containerID="d974ca7e16de9404ab80c37c963c39d73239baae8fb9a3246b8e5e345f171158"
Mar 20 18:08:11 crc kubenswrapper[4795]: I0320 18:08:11.300042 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 18:08:11 crc kubenswrapper[4795]: I0320 18:08:11.300980 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.300315 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.301040 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.301103 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.301919 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.301984 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" gracePeriod=600
Mar 20 18:08:41 crc kubenswrapper[4795]: E0320 18:08:41.445349 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.797002 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" exitCode=0
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.797054 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"}
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.797093 4795 scope.go:117] "RemoveContainer" containerID="4afb4d750c817e30eff9ecfb24ad3e9234b6f2b728f9764146e9238c929ec5cb"
Mar 20 18:08:41 crc kubenswrapper[4795]: I0320 18:08:41.797631 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:08:41 crc kubenswrapper[4795]: E0320 18:08:41.798037 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\"
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:08:56 crc kubenswrapper[4795]: I0320 18:08:56.252797 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:08:56 crc kubenswrapper[4795]: E0320 18:08:56.254101 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:09:08 crc kubenswrapper[4795]: I0320 18:09:08.252659 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:09:08 crc kubenswrapper[4795]: E0320 18:09:08.253930 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:09:20 crc kubenswrapper[4795]: I0320 18:09:20.251924 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:09:20 crc kubenswrapper[4795]: E0320 18:09:20.252943 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:09:33 crc kubenswrapper[4795]: I0320 18:09:33.251930 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:09:33 crc kubenswrapper[4795]: E0320 18:09:33.253871 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:09:47 crc kubenswrapper[4795]: I0320 18:09:47.264540 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:09:47 crc kubenswrapper[4795]: E0320 18:09:47.265327 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:09:49 crc kubenswrapper[4795]: I0320 18:09:49.949416 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-njbls"] Mar 20 18:09:49 crc kubenswrapper[4795]: E0320 18:09:49.950524 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="47914a4d-df4b-443d-b7f4-b30bfe9e7a98" containerName="oc" Mar 20 18:09:49 crc kubenswrapper[4795]: I0320 18:09:49.950552 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="47914a4d-df4b-443d-b7f4-b30bfe9e7a98" containerName="oc" Mar 20 18:09:49 crc kubenswrapper[4795]: I0320 18:09:49.951060 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="47914a4d-df4b-443d-b7f4-b30bfe9e7a98" containerName="oc" Mar 20 18:09:49 crc kubenswrapper[4795]: I0320 18:09:49.953864 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njbls" Mar 20 18:09:49 crc kubenswrapper[4795]: I0320 18:09:49.960192 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-njbls"] Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.032416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls" Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.032490 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6hms\" (UniqueName: \"kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls" Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.032658 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " 
pod="openshift-marketplace/community-operators-njbls" Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.133904 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls" Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.134221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6hms\" (UniqueName: \"kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls" Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.134385 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls" Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.134946 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls" Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.134970 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " 
pod="openshift-marketplace/community-operators-njbls" Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.162864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6hms\" (UniqueName: \"kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms\") pod \"community-operators-njbls\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") " pod="openshift-marketplace/community-operators-njbls" Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.290210 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njbls" Mar 20 18:09:50 crc kubenswrapper[4795]: I0320 18:09:50.904100 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-njbls"] Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.499237 4795 generic.go:334] "Generic (PLEG): container finished" podID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerID="9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c" exitCode=0 Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.499280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerDied","Data":"9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c"} Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.499316 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerStarted","Data":"d3d3bfd1cfcecc4d9898b8b0e44fe867b312105b71e72d952b6aa6c41b752b2a"} Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.736281 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"] Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.738998 4795 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.753547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"] Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.892316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q55qg\" (UniqueName: \"kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.892375 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.892528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.994092 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q55qg\" (UniqueName: \"kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.994155 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.994230 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.995234 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:09:51 crc kubenswrapper[4795]: I0320 18:09:51.995517 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:09:52 crc kubenswrapper[4795]: I0320 18:09:52.020625 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q55qg\" (UniqueName: \"kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg\") pod \"redhat-operators-ldmj6\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") " pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:09:52 crc kubenswrapper[4795]: I0320 18:09:52.080238 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:09:52 crc kubenswrapper[4795]: I0320 18:09:52.630386 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"] Mar 20 18:09:52 crc kubenswrapper[4795]: W0320 18:09:52.641847 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd2f5e2_4399_4d44_9266_a6221ff1548c.slice/crio-8ff5c9d939ea79173148d650042c34b46a99524d892c547cc67c59b58f443670 WatchSource:0}: Error finding container 8ff5c9d939ea79173148d650042c34b46a99524d892c547cc67c59b58f443670: Status 404 returned error can't find the container with id 8ff5c9d939ea79173148d650042c34b46a99524d892c547cc67c59b58f443670 Mar 20 18:09:53 crc kubenswrapper[4795]: I0320 18:09:53.516982 4795 generic.go:334] "Generic (PLEG): container finished" podID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerID="277d3cb4b0e2d6af98c3f5ab3c45418763eb9ab970b847eeb8b262c597a914dd" exitCode=0 Mar 20 18:09:53 crc kubenswrapper[4795]: I0320 18:09:53.517058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerDied","Data":"277d3cb4b0e2d6af98c3f5ab3c45418763eb9ab970b847eeb8b262c597a914dd"} Mar 20 18:09:53 crc kubenswrapper[4795]: I0320 18:09:53.517381 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerStarted","Data":"8ff5c9d939ea79173148d650042c34b46a99524d892c547cc67c59b58f443670"} Mar 20 18:09:53 crc kubenswrapper[4795]: I0320 18:09:53.519288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" 
event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerStarted","Data":"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594"} Mar 20 18:09:55 crc kubenswrapper[4795]: I0320 18:09:55.544831 4795 generic.go:334] "Generic (PLEG): container finished" podID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerID="7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594" exitCode=0 Mar 20 18:09:55 crc kubenswrapper[4795]: I0320 18:09:55.544983 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerDied","Data":"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594"} Mar 20 18:09:55 crc kubenswrapper[4795]: I0320 18:09:55.550598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerStarted","Data":"0321d4359ce799a59310c35a21c76695821cab3f794b5aecb8e10422c41bff1f"} Mar 20 18:09:56 crc kubenswrapper[4795]: I0320 18:09:56.566169 4795 generic.go:334] "Generic (PLEG): container finished" podID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerID="0321d4359ce799a59310c35a21c76695821cab3f794b5aecb8e10422c41bff1f" exitCode=0 Mar 20 18:09:56 crc kubenswrapper[4795]: I0320 18:09:56.566295 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerDied","Data":"0321d4359ce799a59310c35a21c76695821cab3f794b5aecb8e10422c41bff1f"} Mar 20 18:09:57 crc kubenswrapper[4795]: I0320 18:09:57.578961 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerStarted","Data":"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238"} Mar 20 18:09:57 crc kubenswrapper[4795]: I0320 
18:09:57.581662 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerStarted","Data":"18c4e7fb6f51342d6b359aa691f6b4161ae6d5ef881b7ac6fe20572c49c8552d"} Mar 20 18:09:57 crc kubenswrapper[4795]: I0320 18:09:57.607260 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-njbls" podStartSLOduration=3.766932521 podStartE2EDuration="8.607241585s" podCreationTimestamp="2026-03-20 18:09:49 +0000 UTC" firstStartedPulling="2026-03-20 18:09:51.501355025 +0000 UTC m=+3134.959386566" lastFinishedPulling="2026-03-20 18:09:56.341664089 +0000 UTC m=+3139.799695630" observedRunningTime="2026-03-20 18:09:57.598490313 +0000 UTC m=+3141.056521864" watchObservedRunningTime="2026-03-20 18:09:57.607241585 +0000 UTC m=+3141.065273136" Mar 20 18:09:57 crc kubenswrapper[4795]: I0320 18:09:57.625539 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ldmj6" podStartSLOduration=3.07476179 podStartE2EDuration="6.625520502s" podCreationTimestamp="2026-03-20 18:09:51 +0000 UTC" firstStartedPulling="2026-03-20 18:09:53.518826851 +0000 UTC m=+3136.976858392" lastFinishedPulling="2026-03-20 18:09:57.069585563 +0000 UTC m=+3140.527617104" observedRunningTime="2026-03-20 18:09:57.623205421 +0000 UTC m=+3141.081236972" watchObservedRunningTime="2026-03-20 18:09:57.625520502 +0000 UTC m=+3141.083552053" Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.154713 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567170-nh84v"] Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.156370 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-nh84v" Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.179821 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.180129 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.180368 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.189889 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-nh84v"] Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.273609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28n6b\" (UniqueName: \"kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b\") pod \"auto-csr-approver-29567170-nh84v\" (UID: \"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4\") " pod="openshift-infra/auto-csr-approver-29567170-nh84v" Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.291293 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-njbls" Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.291339 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-njbls" Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.376183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28n6b\" (UniqueName: \"kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b\") pod \"auto-csr-approver-29567170-nh84v\" (UID: \"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4\") " 
pod="openshift-infra/auto-csr-approver-29567170-nh84v" Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.415090 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28n6b\" (UniqueName: \"kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b\") pod \"auto-csr-approver-29567170-nh84v\" (UID: \"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4\") " pod="openshift-infra/auto-csr-approver-29567170-nh84v" Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.488910 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-nh84v" Mar 20 18:10:00 crc kubenswrapper[4795]: I0320 18:10:00.988958 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-nh84v"] Mar 20 18:10:00 crc kubenswrapper[4795]: W0320 18:10:00.989337 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode37c7446_ee1e_4fba_b0ff_4b0002aa14b4.slice/crio-95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f WatchSource:0}: Error finding container 95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f: Status 404 returned error can't find the container with id 95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f Mar 20 18:10:01 crc kubenswrapper[4795]: I0320 18:10:01.345766 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-njbls" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="registry-server" probeResult="failure" output=< Mar 20 18:10:01 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 18:10:01 crc kubenswrapper[4795]: > Mar 20 18:10:01 crc kubenswrapper[4795]: I0320 18:10:01.654207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-nh84v" 
event={"ID":"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4","Type":"ContainerStarted","Data":"95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f"} Mar 20 18:10:02 crc kubenswrapper[4795]: I0320 18:10:02.081202 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:10:02 crc kubenswrapper[4795]: I0320 18:10:02.081249 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ldmj6" Mar 20 18:10:02 crc kubenswrapper[4795]: I0320 18:10:02.251939 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:10:02 crc kubenswrapper[4795]: E0320 18:10:02.252179 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:10:02 crc kubenswrapper[4795]: I0320 18:10:02.665522 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-nh84v" event={"ID":"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4","Type":"ContainerStarted","Data":"678d64f2bd651427dd881bafa2432c3aa8854cce5009833f7ed8c8ba9e0a700c"} Mar 20 18:10:02 crc kubenswrapper[4795]: I0320 18:10:02.681566 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567170-nh84v" podStartSLOduration=1.492567885 podStartE2EDuration="2.681546479s" podCreationTimestamp="2026-03-20 18:10:00 +0000 UTC" firstStartedPulling="2026-03-20 18:10:00.992358848 +0000 UTC m=+3144.450390389" lastFinishedPulling="2026-03-20 18:10:02.181337442 +0000 UTC m=+3145.639368983" 
observedRunningTime="2026-03-20 18:10:02.678514146 +0000 UTC m=+3146.136545697" watchObservedRunningTime="2026-03-20 18:10:02.681546479 +0000 UTC m=+3146.139578020" Mar 20 18:10:03 crc kubenswrapper[4795]: I0320 18:10:03.151553 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ldmj6" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="registry-server" probeResult="failure" output=< Mar 20 18:10:03 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 18:10:03 crc kubenswrapper[4795]: > Mar 20 18:10:03 crc kubenswrapper[4795]: I0320 18:10:03.676304 4795 generic.go:334] "Generic (PLEG): container finished" podID="e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" containerID="678d64f2bd651427dd881bafa2432c3aa8854cce5009833f7ed8c8ba9e0a700c" exitCode=0 Mar 20 18:10:03 crc kubenswrapper[4795]: I0320 18:10:03.676352 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-nh84v" event={"ID":"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4","Type":"ContainerDied","Data":"678d64f2bd651427dd881bafa2432c3aa8854cce5009833f7ed8c8ba9e0a700c"} Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.253254 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-nh84v"
Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.371899 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28n6b\" (UniqueName: \"kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b\") pod \"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4\" (UID: \"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4\") "
Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.382984 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b" (OuterVolumeSpecName: "kube-api-access-28n6b") pod "e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" (UID: "e37c7446-ee1e-4fba-b0ff-4b0002aa14b4"). InnerVolumeSpecName "kube-api-access-28n6b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.474750 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28n6b\" (UniqueName: \"kubernetes.io/projected/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4-kube-api-access-28n6b\") on node \"crc\" DevicePath \"\""
Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.693421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-nh84v" event={"ID":"e37c7446-ee1e-4fba-b0ff-4b0002aa14b4","Type":"ContainerDied","Data":"95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f"}
Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.693731 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95886bb3c8f7205f07c9f02b64e8a41865b2c4d66c986bce4db689c92481cb8f"
Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.693506 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-nh84v"
Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.753272 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-j2kpr"]
Mar 20 18:10:05 crc kubenswrapper[4795]: I0320 18:10:05.760961 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-j2kpr"]
Mar 20 18:10:07 crc kubenswrapper[4795]: I0320 18:10:07.266419 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c32f39-7999-4aa6-be6f-bdfc11c61cf8" path="/var/lib/kubelet/pods/42c32f39-7999-4aa6-be6f-bdfc11c61cf8/volumes"
Mar 20 18:10:10 crc kubenswrapper[4795]: I0320 18:10:10.349855 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:10:10 crc kubenswrapper[4795]: I0320 18:10:10.401589 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:10:10 crc kubenswrapper[4795]: I0320 18:10:10.677526 4795 scope.go:117] "RemoveContainer" containerID="b5132a3bc3cc2a9d3f7f72ee052f22ef62d6486d39d7ff63bd2d4ca8c43eb377"
Mar 20 18:10:11 crc kubenswrapper[4795]: I0320 18:10:11.084212 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-njbls"]
Mar 20 18:10:11 crc kubenswrapper[4795]: I0320 18:10:11.755121 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-njbls" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="registry-server" containerID="cri-o://3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238" gracePeriod=2
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.145465 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.205435 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.414145 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.540019 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6hms\" (UniqueName: \"kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms\") pod \"d60e8097-033f-426d-b3d1-c6837d4e6231\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") "
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.540138 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities\") pod \"d60e8097-033f-426d-b3d1-c6837d4e6231\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") "
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.540226 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content\") pod \"d60e8097-033f-426d-b3d1-c6837d4e6231\" (UID: \"d60e8097-033f-426d-b3d1-c6837d4e6231\") "
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.541490 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities" (OuterVolumeSpecName: "utilities") pod "d60e8097-033f-426d-b3d1-c6837d4e6231" (UID: "d60e8097-033f-426d-b3d1-c6837d4e6231"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.552057 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms" (OuterVolumeSpecName: "kube-api-access-t6hms") pod "d60e8097-033f-426d-b3d1-c6837d4e6231" (UID: "d60e8097-033f-426d-b3d1-c6837d4e6231"). InnerVolumeSpecName "kube-api-access-t6hms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.616640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d60e8097-033f-426d-b3d1-c6837d4e6231" (UID: "d60e8097-033f-426d-b3d1-c6837d4e6231"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.643183 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6hms\" (UniqueName: \"kubernetes.io/projected/d60e8097-033f-426d-b3d1-c6837d4e6231-kube-api-access-t6hms\") on node \"crc\" DevicePath \"\""
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.643222 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.643236 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60e8097-033f-426d-b3d1-c6837d4e6231-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.769109 4795 generic.go:334] "Generic (PLEG): container finished" podID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerID="3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238" exitCode=0
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.769932 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-njbls"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.770162 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerDied","Data":"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238"}
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.770224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-njbls" event={"ID":"d60e8097-033f-426d-b3d1-c6837d4e6231","Type":"ContainerDied","Data":"d3d3bfd1cfcecc4d9898b8b0e44fe867b312105b71e72d952b6aa6c41b752b2a"}
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.770254 4795 scope.go:117] "RemoveContainer" containerID="3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.813438 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-njbls"]
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.822031 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-njbls"]
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.832022 4795 scope.go:117] "RemoveContainer" containerID="7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.863720 4795 scope.go:117] "RemoveContainer" containerID="9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.928082 4795 scope.go:117] "RemoveContainer" containerID="3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238"
Mar 20 18:10:12 crc kubenswrapper[4795]: E0320 18:10:12.928420 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238\": container with ID starting with 3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238 not found: ID does not exist" containerID="3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.928605 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238"} err="failed to get container status \"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238\": rpc error: code = NotFound desc = could not find container \"3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238\": container with ID starting with 3a1fbcd9ccc20a732a86a70b9e4e9e5c51cf24c4629105a78515fb4fb6c75238 not found: ID does not exist"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.928629 4795 scope.go:117] "RemoveContainer" containerID="7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594"
Mar 20 18:10:12 crc kubenswrapper[4795]: E0320 18:10:12.928975 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594\": container with ID starting with 7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594 not found: ID does not exist" containerID="7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.928999 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594"} err="failed to get container status \"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594\": rpc error: code = NotFound desc = could not find container \"7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594\": container with ID starting with 7d7e34dad65f040a96203e811404572b0d3a48fe26fe8ab0b73f1090a2a16594 not found: ID does not exist"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.929017 4795 scope.go:117] "RemoveContainer" containerID="9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c"
Mar 20 18:10:12 crc kubenswrapper[4795]: E0320 18:10:12.929273 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c\": container with ID starting with 9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c not found: ID does not exist" containerID="9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c"
Mar 20 18:10:12 crc kubenswrapper[4795]: I0320 18:10:12.929295 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c"} err="failed to get container status \"9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c\": rpc error: code = NotFound desc = could not find container \"9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c\": container with ID starting with 9bed059c1f0f7e0dde508cf2ec7634acfd81e0c790a7b6274dd8ec2e3cf4821c not found: ID does not exist"
Mar 20 18:10:13 crc kubenswrapper[4795]: I0320 18:10:13.265818 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" path="/var/lib/kubelet/pods/d60e8097-033f-426d-b3d1-c6837d4e6231/volumes"
Mar 20 18:10:14 crc kubenswrapper[4795]: I0320 18:10:14.482382 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"]
Mar 20 18:10:14 crc kubenswrapper[4795]: I0320 18:10:14.482875 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ldmj6" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="registry-server" containerID="cri-o://18c4e7fb6f51342d6b359aa691f6b4161ae6d5ef881b7ac6fe20572c49c8552d" gracePeriod=2
Mar 20 18:10:14 crc kubenswrapper[4795]: I0320 18:10:14.793055 4795 generic.go:334] "Generic (PLEG): container finished" podID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerID="18c4e7fb6f51342d6b359aa691f6b4161ae6d5ef881b7ac6fe20572c49c8552d" exitCode=0
Mar 20 18:10:14 crc kubenswrapper[4795]: I0320 18:10:14.793134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerDied","Data":"18c4e7fb6f51342d6b359aa691f6b4161ae6d5ef881b7ac6fe20572c49c8552d"}
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.160432 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.291225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities\") pod \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") "
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.291294 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q55qg\" (UniqueName: \"kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg\") pod \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") "
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.291320 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content\") pod \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\" (UID: \"fdd2f5e2-4399-4d44-9266-a6221ff1548c\") "
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.294471 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities" (OuterVolumeSpecName: "utilities") pod "fdd2f5e2-4399-4d44-9266-a6221ff1548c" (UID: "fdd2f5e2-4399-4d44-9266-a6221ff1548c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.300405 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg" (OuterVolumeSpecName: "kube-api-access-q55qg") pod "fdd2f5e2-4399-4d44-9266-a6221ff1548c" (UID: "fdd2f5e2-4399-4d44-9266-a6221ff1548c"). InnerVolumeSpecName "kube-api-access-q55qg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.393255 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.393287 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q55qg\" (UniqueName: \"kubernetes.io/projected/fdd2f5e2-4399-4d44-9266-a6221ff1548c-kube-api-access-q55qg\") on node \"crc\" DevicePath \"\""
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.442533 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdd2f5e2-4399-4d44-9266-a6221ff1548c" (UID: "fdd2f5e2-4399-4d44-9266-a6221ff1548c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.494410 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2f5e2-4399-4d44-9266-a6221ff1548c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.803634 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldmj6" event={"ID":"fdd2f5e2-4399-4d44-9266-a6221ff1548c","Type":"ContainerDied","Data":"8ff5c9d939ea79173148d650042c34b46a99524d892c547cc67c59b58f443670"}
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.803716 4795 scope.go:117] "RemoveContainer" containerID="18c4e7fb6f51342d6b359aa691f6b4161ae6d5ef881b7ac6fe20572c49c8552d"
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.803856 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ldmj6"
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.836633 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"]
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.837997 4795 scope.go:117] "RemoveContainer" containerID="0321d4359ce799a59310c35a21c76695821cab3f794b5aecb8e10422c41bff1f"
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.846862 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ldmj6"]
Mar 20 18:10:15 crc kubenswrapper[4795]: I0320 18:10:15.860356 4795 scope.go:117] "RemoveContainer" containerID="277d3cb4b0e2d6af98c3f5ab3c45418763eb9ab970b847eeb8b262c597a914dd"
Mar 20 18:10:16 crc kubenswrapper[4795]: I0320 18:10:16.252729 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:10:16 crc kubenswrapper[4795]: E0320 18:10:16.252991 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:10:17 crc kubenswrapper[4795]: I0320 18:10:17.269380 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" path="/var/lib/kubelet/pods/fdd2f5e2-4399-4d44-9266-a6221ff1548c/volumes"
Mar 20 18:10:28 crc kubenswrapper[4795]: I0320 18:10:28.252459 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:10:28 crc kubenswrapper[4795]: E0320 18:10:28.254473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:10:40 crc kubenswrapper[4795]: I0320 18:10:40.252626 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:10:40 crc kubenswrapper[4795]: E0320 18:10:40.253376 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:10:52 crc kubenswrapper[4795]: I0320 18:10:52.252495 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:10:52 crc kubenswrapper[4795]: E0320 18:10:52.253440 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:11:03 crc kubenswrapper[4795]: I0320 18:11:03.254064 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:11:03 crc kubenswrapper[4795]: E0320 18:11:03.255502 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:11:18 crc kubenswrapper[4795]: I0320 18:11:18.252652 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:11:18 crc kubenswrapper[4795]: E0320 18:11:18.253515 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:11:33 crc kubenswrapper[4795]: I0320 18:11:33.255592 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:11:33 crc kubenswrapper[4795]: E0320 18:11:33.256490 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:11:45 crc kubenswrapper[4795]: I0320 18:11:45.252224 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:11:45 crc kubenswrapper[4795]: E0320 18:11:45.253059 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:11:58 crc kubenswrapper[4795]: I0320 18:11:58.252716 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:11:58 crc kubenswrapper[4795]: E0320 18:11:58.254529 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.155463 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567172-nmn82"]
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156550 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="extract-content"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156580 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="extract-content"
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156601 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="registry-server"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156614 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="registry-server"
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156635 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="registry-server"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156647 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="registry-server"
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156667 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" containerName="oc"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156681 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" containerName="oc"
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156743 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="extract-utilities"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156755 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="extract-utilities"
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156799 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="extract-content"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156811 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="extract-content"
Mar 20 18:12:00 crc kubenswrapper[4795]: E0320 18:12:00.156865 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="extract-utilities"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.156879 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="extract-utilities"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.157186 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" containerName="oc"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.157235 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd2f5e2-4399-4d44-9266-a6221ff1548c" containerName="registry-server"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.157262 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60e8097-033f-426d-b3d1-c6837d4e6231" containerName="registry-server"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.158363 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.164260 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.165004 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.165359 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.170182 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-nmn82"]
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.261396 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w74n\" (UniqueName: \"kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n\") pod \"auto-csr-approver-29567172-nmn82\" (UID: \"c9c43e03-d618-403e-874e-ff8337f97372\") " pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.363026 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w74n\" (UniqueName: \"kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n\") pod \"auto-csr-approver-29567172-nmn82\" (UID: \"c9c43e03-d618-403e-874e-ff8337f97372\") " pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.394341 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w74n\" (UniqueName: \"kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n\") pod \"auto-csr-approver-29567172-nmn82\" (UID: \"c9c43e03-d618-403e-874e-ff8337f97372\") " pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:00 crc kubenswrapper[4795]: I0320 18:12:00.492050 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:01 crc kubenswrapper[4795]: I0320 18:12:00.997731 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-nmn82"]
Mar 20 18:12:01 crc kubenswrapper[4795]: I0320 18:12:01.006537 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 18:12:01 crc kubenswrapper[4795]: I0320 18:12:01.847249 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-nmn82" event={"ID":"c9c43e03-d618-403e-874e-ff8337f97372","Type":"ContainerStarted","Data":"18f23700233c7b3587879479e84decfac3907e8a37c00216b2ba53a003b9b7bc"}
Mar 20 18:12:02 crc kubenswrapper[4795]: I0320 18:12:02.858981 4795 generic.go:334] "Generic (PLEG): container finished" podID="c9c43e03-d618-403e-874e-ff8337f97372" containerID="09c9bfb67ac08b2ff7e4b90082be2f738cf4aa4c4b3eb977b33ed1ee55c790cb" exitCode=0
Mar 20 18:12:02 crc kubenswrapper[4795]: I0320 18:12:02.859059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-nmn82" event={"ID":"c9c43e03-d618-403e-874e-ff8337f97372","Type":"ContainerDied","Data":"09c9bfb67ac08b2ff7e4b90082be2f738cf4aa4c4b3eb977b33ed1ee55c790cb"}
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.351045 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.440063 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w74n\" (UniqueName: \"kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n\") pod \"c9c43e03-d618-403e-874e-ff8337f97372\" (UID: \"c9c43e03-d618-403e-874e-ff8337f97372\") "
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.444961 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n" (OuterVolumeSpecName: "kube-api-access-7w74n") pod "c9c43e03-d618-403e-874e-ff8337f97372" (UID: "c9c43e03-d618-403e-874e-ff8337f97372"). InnerVolumeSpecName "kube-api-access-7w74n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.541960 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w74n\" (UniqueName: \"kubernetes.io/projected/c9c43e03-d618-403e-874e-ff8337f97372-kube-api-access-7w74n\") on node \"crc\" DevicePath \"\""
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.884084 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-nmn82" event={"ID":"c9c43e03-d618-403e-874e-ff8337f97372","Type":"ContainerDied","Data":"18f23700233c7b3587879479e84decfac3907e8a37c00216b2ba53a003b9b7bc"}
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.884146 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18f23700233c7b3587879479e84decfac3907e8a37c00216b2ba53a003b9b7bc"
Mar 20 18:12:04 crc kubenswrapper[4795]: I0320 18:12:04.884160 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-nmn82"
Mar 20 18:12:05 crc kubenswrapper[4795]: I0320 18:12:05.443743 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-n9tld"]
Mar 20 18:12:05 crc kubenswrapper[4795]: I0320 18:12:05.455412 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-n9tld"]
Mar 20 18:12:07 crc kubenswrapper[4795]: I0320 18:12:07.263511 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26125bad-5b31-4c3d-901b-758cb842af78" path="/var/lib/kubelet/pods/26125bad-5b31-4c3d-901b-758cb842af78/volumes"
Mar 20 18:12:10 crc kubenswrapper[4795]: I0320 18:12:10.831350 4795 scope.go:117] "RemoveContainer" containerID="388b030bf3690b9a4b1c0fb962fdb64c5a18c01bd7022e209c55231177d61b95"
Mar 20 18:12:12 crc kubenswrapper[4795]: I0320 18:12:12.253208 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:12:12 crc kubenswrapper[4795]: E0320 18:12:12.253970 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:12:27 crc kubenswrapper[4795]: I0320 18:12:27.265044 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:12:27 crc kubenswrapper[4795]: E0320 18:12:27.266249 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:12:41 crc kubenswrapper[4795]: I0320 18:12:41.253556 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:12:41 crc kubenswrapper[4795]: E0320 18:12:41.254576 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:12:54 crc kubenswrapper[4795]: I0320 18:12:54.252013 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:12:54 crc kubenswrapper[4795]: E0320 18:12:54.252919 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:12:55 crc kubenswrapper[4795]: I0320 18:12:55.679337 4795 patch_prober.go:28] interesting pod/router-default-5444994796-lrxrs container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 18:12:55 crc kubenswrapper[4795]: I0320 18:12:55.679742 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-lrxrs" podUID="067aa008-8dda-4bfe-bfd2-388abdb54299" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 18:13:08 crc kubenswrapper[4795]: I0320 18:13:08.251701 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:13:08 crc kubenswrapper[4795]: E0320 18:13:08.252477 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:13:23 crc kubenswrapper[4795]: I0320 18:13:23.252758 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:13:23 crc kubenswrapper[4795]: E0320 18:13:23.253503 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:13:36 crc kubenswrapper[4795]: I0320 18:13:36.252294 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:13:36 crc kubenswrapper[4795]: E0320 18:13:36.252975 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:13:51 crc kubenswrapper[4795]: I0320 18:13:51.259584 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5"
Mar 20 18:13:52 crc kubenswrapper[4795]: I0320 18:13:52.318709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d"}
Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.189187 4795 kubelet.go:2421]
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567174-8lkms"] Mar 20 18:14:00 crc kubenswrapper[4795]: E0320 18:14:00.190083 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c43e03-d618-403e-874e-ff8337f97372" containerName="oc" Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.190095 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c43e03-d618-403e-874e-ff8337f97372" containerName="oc" Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.190280 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c43e03-d618-403e-874e-ff8337f97372" containerName="oc" Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.190929 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-8lkms" Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.195311 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.195503 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.195652 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.214643 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-8lkms"] Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.342091 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz6fh\" (UniqueName: \"kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh\") pod \"auto-csr-approver-29567174-8lkms\" (UID: \"d290fd42-4040-428e-8af1-8091250112e7\") " pod="openshift-infra/auto-csr-approver-29567174-8lkms" Mar 20 18:14:00 
crc kubenswrapper[4795]: I0320 18:14:00.444393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz6fh\" (UniqueName: \"kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh\") pod \"auto-csr-approver-29567174-8lkms\" (UID: \"d290fd42-4040-428e-8af1-8091250112e7\") " pod="openshift-infra/auto-csr-approver-29567174-8lkms" Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.464680 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz6fh\" (UniqueName: \"kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh\") pod \"auto-csr-approver-29567174-8lkms\" (UID: \"d290fd42-4040-428e-8af1-8091250112e7\") " pod="openshift-infra/auto-csr-approver-29567174-8lkms" Mar 20 18:14:00 crc kubenswrapper[4795]: I0320 18:14:00.517939 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-8lkms" Mar 20 18:14:01 crc kubenswrapper[4795]: I0320 18:14:01.000715 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-8lkms"] Mar 20 18:14:01 crc kubenswrapper[4795]: W0320 18:14:01.008784 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd290fd42_4040_428e_8af1_8091250112e7.slice/crio-567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f WatchSource:0}: Error finding container 567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f: Status 404 returned error can't find the container with id 567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f Mar 20 18:14:01 crc kubenswrapper[4795]: I0320 18:14:01.423439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-8lkms" 
event={"ID":"d290fd42-4040-428e-8af1-8091250112e7","Type":"ContainerStarted","Data":"567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f"} Mar 20 18:14:02 crc kubenswrapper[4795]: I0320 18:14:02.460294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-8lkms" event={"ID":"d290fd42-4040-428e-8af1-8091250112e7","Type":"ContainerStarted","Data":"6d1a2371250aa4bfa6255ea0f649377d871a255b30df541957c8c5e80c58e7c1"} Mar 20 18:14:02 crc kubenswrapper[4795]: I0320 18:14:02.484615 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567174-8lkms" podStartSLOduration=1.5877354769999998 podStartE2EDuration="2.484595445s" podCreationTimestamp="2026-03-20 18:14:00 +0000 UTC" firstStartedPulling="2026-03-20 18:14:01.011333266 +0000 UTC m=+3384.469364817" lastFinishedPulling="2026-03-20 18:14:01.908193234 +0000 UTC m=+3385.366224785" observedRunningTime="2026-03-20 18:14:02.480270201 +0000 UTC m=+3385.938301772" watchObservedRunningTime="2026-03-20 18:14:02.484595445 +0000 UTC m=+3385.942626996" Mar 20 18:14:03 crc kubenswrapper[4795]: I0320 18:14:03.472910 4795 generic.go:334] "Generic (PLEG): container finished" podID="d290fd42-4040-428e-8af1-8091250112e7" containerID="6d1a2371250aa4bfa6255ea0f649377d871a255b30df541957c8c5e80c58e7c1" exitCode=0 Mar 20 18:14:03 crc kubenswrapper[4795]: I0320 18:14:03.473043 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-8lkms" event={"ID":"d290fd42-4040-428e-8af1-8091250112e7","Type":"ContainerDied","Data":"6d1a2371250aa4bfa6255ea0f649377d871a255b30df541957c8c5e80c58e7c1"} Mar 20 18:14:04 crc kubenswrapper[4795]: I0320 18:14:04.961703 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-8lkms" Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.033534 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz6fh\" (UniqueName: \"kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh\") pod \"d290fd42-4040-428e-8af1-8091250112e7\" (UID: \"d290fd42-4040-428e-8af1-8091250112e7\") " Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.056900 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh" (OuterVolumeSpecName: "kube-api-access-cz6fh") pod "d290fd42-4040-428e-8af1-8091250112e7" (UID: "d290fd42-4040-428e-8af1-8091250112e7"). InnerVolumeSpecName "kube-api-access-cz6fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.136180 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz6fh\" (UniqueName: \"kubernetes.io/projected/d290fd42-4040-428e-8af1-8091250112e7-kube-api-access-cz6fh\") on node \"crc\" DevicePath \"\"" Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.497887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-8lkms" event={"ID":"d290fd42-4040-428e-8af1-8091250112e7","Type":"ContainerDied","Data":"567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f"} Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.498080 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="567e1e3025caa211725ecf9270c85193612fecd099c86cd962af3a874ca08c5f" Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.497968 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-8lkms" Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.572327 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-m7dd2"] Mar 20 18:14:05 crc kubenswrapper[4795]: I0320 18:14:05.580619 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-m7dd2"] Mar 20 18:14:07 crc kubenswrapper[4795]: I0320 18:14:07.264411 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47914a4d-df4b-443d-b7f4-b30bfe9e7a98" path="/var/lib/kubelet/pods/47914a4d-df4b-443d-b7f4-b30bfe9e7a98/volumes" Mar 20 18:14:10 crc kubenswrapper[4795]: I0320 18:14:10.973450 4795 scope.go:117] "RemoveContainer" containerID="bf867a6e7d6c5ea2dcb1d75b1399a5354375f53c845fe9a30c34fd725a0eba5f" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.168827 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"] Mar 20 18:15:00 crc kubenswrapper[4795]: E0320 18:15:00.177334 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d290fd42-4040-428e-8af1-8091250112e7" containerName="oc" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.177364 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d290fd42-4040-428e-8af1-8091250112e7" containerName="oc" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.177560 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d290fd42-4040-428e-8af1-8091250112e7" containerName="oc" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.178138 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"] Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.178219 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.183096 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.183227 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.304172 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27jq9\" (UniqueName: \"kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.304561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.304710 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.406140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.406217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27jq9\" (UniqueName: \"kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.406290 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.407254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.413801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.439792 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27jq9\" (UniqueName: \"kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9\") pod \"collect-profiles-29567175-bdtnd\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.514281 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:00 crc kubenswrapper[4795]: I0320 18:15:00.966358 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd"] Mar 20 18:15:01 crc kubenswrapper[4795]: I0320 18:15:01.065110 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" event={"ID":"6419bd80-fcd7-4b70-bb03-c8a97aa4da93","Type":"ContainerStarted","Data":"8148282fa276f46ca897b62036478449efe04e103762f682938e37d456ce8f61"} Mar 20 18:15:02 crc kubenswrapper[4795]: I0320 18:15:02.079096 4795 generic.go:334] "Generic (PLEG): container finished" podID="6419bd80-fcd7-4b70-bb03-c8a97aa4da93" containerID="f4bbde7b605a50344cd144b9443257804c13dc6048d563bf2b8b2bd90820631f" exitCode=0 Mar 20 18:15:02 crc kubenswrapper[4795]: I0320 18:15:02.079407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" event={"ID":"6419bd80-fcd7-4b70-bb03-c8a97aa4da93","Type":"ContainerDied","Data":"f4bbde7b605a50344cd144b9443257804c13dc6048d563bf2b8b2bd90820631f"} Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.676196 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.878591 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume\") pod \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.878822 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27jq9\" (UniqueName: \"kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9\") pod \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.878858 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume\") pod \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\" (UID: \"6419bd80-fcd7-4b70-bb03-c8a97aa4da93\") " Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.879437 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume" (OuterVolumeSpecName: "config-volume") pod "6419bd80-fcd7-4b70-bb03-c8a97aa4da93" (UID: "6419bd80-fcd7-4b70-bb03-c8a97aa4da93"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.884799 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9" (OuterVolumeSpecName: "kube-api-access-27jq9") pod "6419bd80-fcd7-4b70-bb03-c8a97aa4da93" (UID: "6419bd80-fcd7-4b70-bb03-c8a97aa4da93"). 
InnerVolumeSpecName "kube-api-access-27jq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.884969 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6419bd80-fcd7-4b70-bb03-c8a97aa4da93" (UID: "6419bd80-fcd7-4b70-bb03-c8a97aa4da93"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.981593 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.981628 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27jq9\" (UniqueName: \"kubernetes.io/projected/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-kube-api-access-27jq9\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:03 crc kubenswrapper[4795]: I0320 18:15:03.981638 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6419bd80-fcd7-4b70-bb03-c8a97aa4da93-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:04 crc kubenswrapper[4795]: I0320 18:15:04.106393 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" event={"ID":"6419bd80-fcd7-4b70-bb03-c8a97aa4da93","Type":"ContainerDied","Data":"8148282fa276f46ca897b62036478449efe04e103762f682938e37d456ce8f61"} Mar 20 18:15:04 crc kubenswrapper[4795]: I0320 18:15:04.106454 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8148282fa276f46ca897b62036478449efe04e103762f682938e37d456ce8f61" Mar 20 18:15:04 crc kubenswrapper[4795]: I0320 18:15:04.106476 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-bdtnd" Mar 20 18:15:04 crc kubenswrapper[4795]: I0320 18:15:04.771897 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn"] Mar 20 18:15:04 crc kubenswrapper[4795]: I0320 18:15:04.780898 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-p6qhn"] Mar 20 18:15:05 crc kubenswrapper[4795]: I0320 18:15:05.279026 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6892589-ca9a-45cc-8991-ab0029e67e3c" path="/var/lib/kubelet/pods/a6892589-ca9a-45cc-8991-ab0029e67e3c/volumes" Mar 20 18:15:11 crc kubenswrapper[4795]: I0320 18:15:11.070190 4795 scope.go:117] "RemoveContainer" containerID="c957ead85ece246848e605f8f78734d00ae750bd985db9f200ae787909bd1425" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.158922 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567176-jlvh2"] Mar 20 18:16:00 crc kubenswrapper[4795]: E0320 18:16:00.159942 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6419bd80-fcd7-4b70-bb03-c8a97aa4da93" containerName="collect-profiles" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.159957 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6419bd80-fcd7-4b70-bb03-c8a97aa4da93" containerName="collect-profiles" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.160172 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6419bd80-fcd7-4b70-bb03-c8a97aa4da93" containerName="collect-profiles" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.160826 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.163912 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.164102 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.164111 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.176585 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-jlvh2"] Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.261001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2tg\" (UniqueName: \"kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg\") pod \"auto-csr-approver-29567176-jlvh2\" (UID: \"6847a127-2563-4611-aa3c-5de097af7485\") " pod="openshift-infra/auto-csr-approver-29567176-jlvh2" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.363012 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2tg\" (UniqueName: \"kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg\") pod \"auto-csr-approver-29567176-jlvh2\" (UID: \"6847a127-2563-4611-aa3c-5de097af7485\") " pod="openshift-infra/auto-csr-approver-29567176-jlvh2" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.393357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2tg\" (UniqueName: \"kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg\") pod \"auto-csr-approver-29567176-jlvh2\" (UID: \"6847a127-2563-4611-aa3c-5de097af7485\") " 
pod="openshift-infra/auto-csr-approver-29567176-jlvh2" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.485959 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" Mar 20 18:16:00 crc kubenswrapper[4795]: I0320 18:16:00.978614 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-jlvh2"] Mar 20 18:16:01 crc kubenswrapper[4795]: W0320 18:16:01.004316 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6847a127_2563_4611_aa3c_5de097af7485.slice/crio-8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725 WatchSource:0}: Error finding container 8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725: Status 404 returned error can't find the container with id 8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725 Mar 20 18:16:01 crc kubenswrapper[4795]: I0320 18:16:01.730340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" event={"ID":"6847a127-2563-4611-aa3c-5de097af7485","Type":"ContainerStarted","Data":"8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725"} Mar 20 18:16:02 crc kubenswrapper[4795]: I0320 18:16:02.747012 4795 generic.go:334] "Generic (PLEG): container finished" podID="6847a127-2563-4611-aa3c-5de097af7485" containerID="fcb0c54c2a527f381862afe1aaeeba3ced38b835a91522600710892ac634473c" exitCode=0 Mar 20 18:16:02 crc kubenswrapper[4795]: I0320 18:16:02.747121 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" event={"ID":"6847a127-2563-4611-aa3c-5de097af7485","Type":"ContainerDied","Data":"fcb0c54c2a527f381862afe1aaeeba3ced38b835a91522600710892ac634473c"} Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.313575 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.353749 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz2tg\" (UniqueName: \"kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg\") pod \"6847a127-2563-4611-aa3c-5de097af7485\" (UID: \"6847a127-2563-4611-aa3c-5de097af7485\") " Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.362050 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg" (OuterVolumeSpecName: "kube-api-access-xz2tg") pod "6847a127-2563-4611-aa3c-5de097af7485" (UID: "6847a127-2563-4611-aa3c-5de097af7485"). InnerVolumeSpecName "kube-api-access-xz2tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.456087 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz2tg\" (UniqueName: \"kubernetes.io/projected/6847a127-2563-4611-aa3c-5de097af7485-kube-api-access-xz2tg\") on node \"crc\" DevicePath \"\"" Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.771861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" event={"ID":"6847a127-2563-4611-aa3c-5de097af7485","Type":"ContainerDied","Data":"8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725"} Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.771902 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-jlvh2" Mar 20 18:16:04 crc kubenswrapper[4795]: I0320 18:16:04.771907 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e2d2a4673f6a740cd811fac0341342830e80a67e6ff0a33474b4939012e8725" Mar 20 18:16:05 crc kubenswrapper[4795]: I0320 18:16:05.399480 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-nh84v"] Mar 20 18:16:05 crc kubenswrapper[4795]: I0320 18:16:05.410104 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-nh84v"] Mar 20 18:16:07 crc kubenswrapper[4795]: I0320 18:16:07.270834 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37c7446-ee1e-4fba-b0ff-4b0002aa14b4" path="/var/lib/kubelet/pods/e37c7446-ee1e-4fba-b0ff-4b0002aa14b4/volumes" Mar 20 18:16:11 crc kubenswrapper[4795]: I0320 18:16:11.131632 4795 scope.go:117] "RemoveContainer" containerID="678d64f2bd651427dd881bafa2432c3aa8854cce5009833f7ed8c8ba9e0a700c" Mar 20 18:16:11 crc kubenswrapper[4795]: I0320 18:16:11.300397 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:16:11 crc kubenswrapper[4795]: I0320 18:16:11.300473 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:16:41 crc kubenswrapper[4795]: I0320 18:16:41.299991 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:16:41 crc kubenswrapper[4795]: I0320 18:16:41.300771 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.777470 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:10 crc kubenswrapper[4795]: E0320 18:17:10.778617 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6847a127-2563-4611-aa3c-5de097af7485" containerName="oc" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.778635 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6847a127-2563-4611-aa3c-5de097af7485" containerName="oc" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.779020 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6847a127-2563-4611-aa3c-5de097af7485" containerName="oc" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.780766 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.803147 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.874810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.874880 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.874942 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lt6v\" (UniqueName: \"kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.977478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.977606 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.977770 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lt6v\" (UniqueName: \"kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.978094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:10 crc kubenswrapper[4795]: I0320 18:17:10.978170 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.003619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lt6v\" (UniqueName: \"kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v\") pod \"certified-operators-fvx6c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.115798 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.301980 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.302283 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.302332 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.303123 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.303181 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d" gracePeriod=600 Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.519038 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d" exitCode=0 Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.519078 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d"} Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.519110 4795 scope.go:117] "RemoveContainer" containerID="dee39f2055a7cf58e36eb09d042f050c11c32d9f4b227873d7188176fb8311d5" Mar 20 18:17:11 crc kubenswrapper[4795]: W0320 18:17:11.604092 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod693b9900_2584_45d2_9cf6_9bc22c3c010c.slice/crio-a68d49abbad2da526d582af2866a8efeaa8eaeef6ab510b1836b5ed1e1ce0105 WatchSource:0}: Error finding container a68d49abbad2da526d582af2866a8efeaa8eaeef6ab510b1836b5ed1e1ce0105: Status 404 returned error can't find the container with id a68d49abbad2da526d582af2866a8efeaa8eaeef6ab510b1836b5ed1e1ce0105 Mar 20 18:17:11 crc kubenswrapper[4795]: I0320 18:17:11.604542 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:12 crc kubenswrapper[4795]: I0320 18:17:12.534839 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"} Mar 20 18:17:12 crc kubenswrapper[4795]: I0320 18:17:12.539115 4795 generic.go:334] "Generic (PLEG): container finished" podID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerID="f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e" exitCode=0 Mar 20 18:17:12 crc kubenswrapper[4795]: 
I0320 18:17:12.539189 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerDied","Data":"f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e"} Mar 20 18:17:12 crc kubenswrapper[4795]: I0320 18:17:12.539236 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerStarted","Data":"a68d49abbad2da526d582af2866a8efeaa8eaeef6ab510b1836b5ed1e1ce0105"} Mar 20 18:17:12 crc kubenswrapper[4795]: I0320 18:17:12.542986 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:17:13 crc kubenswrapper[4795]: I0320 18:17:13.553678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerStarted","Data":"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377"} Mar 20 18:17:15 crc kubenswrapper[4795]: I0320 18:17:15.568849 4795 generic.go:334] "Generic (PLEG): container finished" podID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerID="0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377" exitCode=0 Mar 20 18:17:15 crc kubenswrapper[4795]: I0320 18:17:15.568927 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerDied","Data":"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377"} Mar 20 18:17:16 crc kubenswrapper[4795]: I0320 18:17:16.583566 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerStarted","Data":"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049"} Mar 20 
18:17:16 crc kubenswrapper[4795]: I0320 18:17:16.605775 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fvx6c" podStartSLOduration=3.15985909 podStartE2EDuration="6.605759674s" podCreationTimestamp="2026-03-20 18:17:10 +0000 UTC" firstStartedPulling="2026-03-20 18:17:12.542492025 +0000 UTC m=+3576.000523606" lastFinishedPulling="2026-03-20 18:17:15.988392649 +0000 UTC m=+3579.446424190" observedRunningTime="2026-03-20 18:17:16.602344087 +0000 UTC m=+3580.060375628" watchObservedRunningTime="2026-03-20 18:17:16.605759674 +0000 UTC m=+3580.063791215" Mar 20 18:17:21 crc kubenswrapper[4795]: I0320 18:17:21.116681 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:21 crc kubenswrapper[4795]: I0320 18:17:21.117253 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:21 crc kubenswrapper[4795]: I0320 18:17:21.178965 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:21 crc kubenswrapper[4795]: I0320 18:17:21.679788 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:21 crc kubenswrapper[4795]: I0320 18:17:21.724797 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:23 crc kubenswrapper[4795]: I0320 18:17:23.667462 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fvx6c" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="registry-server" containerID="cri-o://c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049" gracePeriod=2 Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.248648 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.355148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content\") pod \"693b9900-2584-45d2-9cf6-9bc22c3c010c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.355985 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lt6v\" (UniqueName: \"kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v\") pod \"693b9900-2584-45d2-9cf6-9bc22c3c010c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.356088 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities\") pod \"693b9900-2584-45d2-9cf6-9bc22c3c010c\" (UID: \"693b9900-2584-45d2-9cf6-9bc22c3c010c\") " Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.358169 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities" (OuterVolumeSpecName: "utilities") pod "693b9900-2584-45d2-9cf6-9bc22c3c010c" (UID: "693b9900-2584-45d2-9cf6-9bc22c3c010c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.367753 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v" (OuterVolumeSpecName: "kube-api-access-7lt6v") pod "693b9900-2584-45d2-9cf6-9bc22c3c010c" (UID: "693b9900-2584-45d2-9cf6-9bc22c3c010c"). 
InnerVolumeSpecName "kube-api-access-7lt6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.422051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "693b9900-2584-45d2-9cf6-9bc22c3c010c" (UID: "693b9900-2584-45d2-9cf6-9bc22c3c010c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.458052 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.458087 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/693b9900-2584-45d2-9cf6-9bc22c3c010c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.458097 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lt6v\" (UniqueName: \"kubernetes.io/projected/693b9900-2584-45d2-9cf6-9bc22c3c010c-kube-api-access-7lt6v\") on node \"crc\" DevicePath \"\"" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.680800 4795 generic.go:334] "Generic (PLEG): container finished" podID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerID="c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049" exitCode=0 Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.680839 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerDied","Data":"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049"} Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.681156 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvx6c" event={"ID":"693b9900-2584-45d2-9cf6-9bc22c3c010c","Type":"ContainerDied","Data":"a68d49abbad2da526d582af2866a8efeaa8eaeef6ab510b1836b5ed1e1ce0105"} Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.680923 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvx6c" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.681177 4795 scope.go:117] "RemoveContainer" containerID="c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.718639 4795 scope.go:117] "RemoveContainer" containerID="0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.725298 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.734928 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fvx6c"] Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.752184 4795 scope.go:117] "RemoveContainer" containerID="f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.807406 4795 scope.go:117] "RemoveContainer" containerID="c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049" Mar 20 18:17:24 crc kubenswrapper[4795]: E0320 18:17:24.807975 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049\": container with ID starting with c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049 not found: ID does not exist" containerID="c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049" Mar 20 18:17:24 
crc kubenswrapper[4795]: I0320 18:17:24.808011 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049"} err="failed to get container status \"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049\": rpc error: code = NotFound desc = could not find container \"c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049\": container with ID starting with c70d722b737af345eb862ea92f33672704ec58c9115b5aa4917a6708ad053049 not found: ID does not exist" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.808038 4795 scope.go:117] "RemoveContainer" containerID="0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377" Mar 20 18:17:24 crc kubenswrapper[4795]: E0320 18:17:24.808521 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377\": container with ID starting with 0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377 not found: ID does not exist" containerID="0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.808587 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377"} err="failed to get container status \"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377\": rpc error: code = NotFound desc = could not find container \"0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377\": container with ID starting with 0e57879de57a3dd7261c3da3b46fe3e2daf810a3d2244f5e069d4afec4216377 not found: ID does not exist" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.808628 4795 scope.go:117] "RemoveContainer" containerID="f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e" Mar 20 
18:17:24 crc kubenswrapper[4795]: E0320 18:17:24.809089 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e\": container with ID starting with f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e not found: ID does not exist" containerID="f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e" Mar 20 18:17:24 crc kubenswrapper[4795]: I0320 18:17:24.809121 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e"} err="failed to get container status \"f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e\": rpc error: code = NotFound desc = could not find container \"f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e\": container with ID starting with f175baf3cafe49e373d2fee473f4b33d0ff860af6ccd4bf87ac89416063e6a9e not found: ID does not exist" Mar 20 18:17:25 crc kubenswrapper[4795]: I0320 18:17:25.273984 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" path="/var/lib/kubelet/pods/693b9900-2584-45d2-9cf6-9bc22c3c010c/volumes" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.180378 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567178-vbj6l"] Mar 20 18:18:00 crc kubenswrapper[4795]: E0320 18:18:00.181221 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="extract-content" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.181234 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="extract-content" Mar 20 18:18:00 crc kubenswrapper[4795]: E0320 18:18:00.181253 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="extract-utilities" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.181260 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="extract-utilities" Mar 20 18:18:00 crc kubenswrapper[4795]: E0320 18:18:00.181276 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="registry-server" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.181282 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="registry-server" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.181458 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="693b9900-2584-45d2-9cf6-9bc22c3c010c" containerName="registry-server" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.182097 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.186895 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.187377 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.187972 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.210123 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-vbj6l"] Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.290735 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhcj6\" (UniqueName: 
\"kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6\") pod \"auto-csr-approver-29567178-vbj6l\" (UID: \"cf1bb697-899c-48fe-984a-61258e78cd87\") " pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.393316 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhcj6\" (UniqueName: \"kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6\") pod \"auto-csr-approver-29567178-vbj6l\" (UID: \"cf1bb697-899c-48fe-984a-61258e78cd87\") " pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.434598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhcj6\" (UniqueName: \"kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6\") pod \"auto-csr-approver-29567178-vbj6l\" (UID: \"cf1bb697-899c-48fe-984a-61258e78cd87\") " pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:00 crc kubenswrapper[4795]: I0320 18:18:00.505199 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:01 crc kubenswrapper[4795]: I0320 18:18:00.999271 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-vbj6l"] Mar 20 18:18:01 crc kubenswrapper[4795]: I0320 18:18:01.113523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" event={"ID":"cf1bb697-899c-48fe-984a-61258e78cd87","Type":"ContainerStarted","Data":"9dbe8e0a3fbec037daa28087ca926302146d2f153b7efd96de4164b0d50b1132"} Mar 20 18:18:03 crc kubenswrapper[4795]: I0320 18:18:03.142218 4795 generic.go:334] "Generic (PLEG): container finished" podID="cf1bb697-899c-48fe-984a-61258e78cd87" containerID="78ef24a78e7dcac7e46de63dc467ed76c00bbe2831c0a8a33ad6b914782524d5" exitCode=0 Mar 20 18:18:03 crc kubenswrapper[4795]: I0320 18:18:03.142316 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" event={"ID":"cf1bb697-899c-48fe-984a-61258e78cd87","Type":"ContainerDied","Data":"78ef24a78e7dcac7e46de63dc467ed76c00bbe2831c0a8a33ad6b914782524d5"} Mar 20 18:18:04 crc kubenswrapper[4795]: I0320 18:18:04.613291 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:04 crc kubenswrapper[4795]: I0320 18:18:04.778527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhcj6\" (UniqueName: \"kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6\") pod \"cf1bb697-899c-48fe-984a-61258e78cd87\" (UID: \"cf1bb697-899c-48fe-984a-61258e78cd87\") " Mar 20 18:18:04 crc kubenswrapper[4795]: I0320 18:18:04.785931 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6" (OuterVolumeSpecName: "kube-api-access-fhcj6") pod "cf1bb697-899c-48fe-984a-61258e78cd87" (UID: "cf1bb697-899c-48fe-984a-61258e78cd87"). InnerVolumeSpecName "kube-api-access-fhcj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:18:04 crc kubenswrapper[4795]: I0320 18:18:04.882210 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhcj6\" (UniqueName: \"kubernetes.io/projected/cf1bb697-899c-48fe-984a-61258e78cd87-kube-api-access-fhcj6\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:05 crc kubenswrapper[4795]: I0320 18:18:05.167633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" event={"ID":"cf1bb697-899c-48fe-984a-61258e78cd87","Type":"ContainerDied","Data":"9dbe8e0a3fbec037daa28087ca926302146d2f153b7efd96de4164b0d50b1132"} Mar 20 18:18:05 crc kubenswrapper[4795]: I0320 18:18:05.167762 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dbe8e0a3fbec037daa28087ca926302146d2f153b7efd96de4164b0d50b1132" Mar 20 18:18:05 crc kubenswrapper[4795]: I0320 18:18:05.168120 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-vbj6l" Mar 20 18:18:05 crc kubenswrapper[4795]: I0320 18:18:05.720914 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-nmn82"] Mar 20 18:18:05 crc kubenswrapper[4795]: I0320 18:18:05.731606 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-nmn82"] Mar 20 18:18:07 crc kubenswrapper[4795]: I0320 18:18:07.265329 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c43e03-d618-403e-874e-ff8337f97372" path="/var/lib/kubelet/pods/c9c43e03-d618-403e-874e-ff8337f97372/volumes" Mar 20 18:18:11 crc kubenswrapper[4795]: I0320 18:18:11.222559 4795 scope.go:117] "RemoveContainer" containerID="09c9bfb67ac08b2ff7e4b90082be2f738cf4aa4c4b3eb977b33ed1ee55c790cb" Mar 20 18:18:34 crc kubenswrapper[4795]: I0320 18:18:34.959714 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"] Mar 20 18:18:34 crc kubenswrapper[4795]: E0320 18:18:34.960682 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1bb697-899c-48fe-984a-61258e78cd87" containerName="oc" Mar 20 18:18:34 crc kubenswrapper[4795]: I0320 18:18:34.960713 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1bb697-899c-48fe-984a-61258e78cd87" containerName="oc" Mar 20 18:18:34 crc kubenswrapper[4795]: I0320 18:18:34.960929 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1bb697-899c-48fe-984a-61258e78cd87" containerName="oc" Mar 20 18:18:34 crc kubenswrapper[4795]: I0320 18:18:34.962677 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:34 crc kubenswrapper[4795]: I0320 18:18:34.984059 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"]
Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.098561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.098644 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4r82\" (UniqueName: \"kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.098886 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.201080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.201531 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.201578 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4r82\" (UniqueName: \"kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.201986 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.202207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.231853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4r82\" (UniqueName: \"kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82\") pod \"redhat-marketplace-99xjf\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") " pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.289824 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:35 crc kubenswrapper[4795]: I0320 18:18:35.816759 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"]
Mar 20 18:18:35 crc kubenswrapper[4795]: W0320 18:18:35.818152 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod282a97b6_2e41_4e26_a941_77a9c94206cb.slice/crio-8cc1ef88ff1d93f454f1b32320bba18f8ff1fb8b9090c61383c2422173e8185a WatchSource:0}: Error finding container 8cc1ef88ff1d93f454f1b32320bba18f8ff1fb8b9090c61383c2422173e8185a: Status 404 returned error can't find the container with id 8cc1ef88ff1d93f454f1b32320bba18f8ff1fb8b9090c61383c2422173e8185a
Mar 20 18:18:36 crc kubenswrapper[4795]: I0320 18:18:36.522122 4795 generic.go:334] "Generic (PLEG): container finished" podID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerID="b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c" exitCode=0
Mar 20 18:18:36 crc kubenswrapper[4795]: I0320 18:18:36.522241 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerDied","Data":"b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c"}
Mar 20 18:18:36 crc kubenswrapper[4795]: I0320 18:18:36.522506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerStarted","Data":"8cc1ef88ff1d93f454f1b32320bba18f8ff1fb8b9090c61383c2422173e8185a"}
Mar 20 18:18:37 crc kubenswrapper[4795]: I0320 18:18:37.539436 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerStarted","Data":"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5"}
Mar 20 18:18:38 crc kubenswrapper[4795]: I0320 18:18:38.577969 4795 generic.go:334] "Generic (PLEG): container finished" podID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerID="eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5" exitCode=0
Mar 20 18:18:38 crc kubenswrapper[4795]: I0320 18:18:38.578027 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerDied","Data":"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5"}
Mar 20 18:18:39 crc kubenswrapper[4795]: I0320 18:18:39.591002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerStarted","Data":"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4"}
Mar 20 18:18:39 crc kubenswrapper[4795]: I0320 18:18:39.619785 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-99xjf" podStartSLOduration=2.990488324 podStartE2EDuration="5.619761594s" podCreationTimestamp="2026-03-20 18:18:34 +0000 UTC" firstStartedPulling="2026-03-20 18:18:36.524653009 +0000 UTC m=+3659.982684550" lastFinishedPulling="2026-03-20 18:18:39.153926289 +0000 UTC m=+3662.611957820" observedRunningTime="2026-03-20 18:18:39.609086962 +0000 UTC m=+3663.067118543" watchObservedRunningTime="2026-03-20 18:18:39.619761594 +0000 UTC m=+3663.077793145"
Mar 20 18:18:45 crc kubenswrapper[4795]: I0320 18:18:45.290849 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:45 crc kubenswrapper[4795]: I0320 18:18:45.291549 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:45 crc kubenswrapper[4795]: I0320 18:18:45.348325 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:45 crc kubenswrapper[4795]: I0320 18:18:45.692959 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:45 crc kubenswrapper[4795]: I0320 18:18:45.745547 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"]
Mar 20 18:18:47 crc kubenswrapper[4795]: I0320 18:18:47.670783 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-99xjf" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="registry-server" containerID="cri-o://b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4" gracePeriod=2
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.235023 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.373367 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content\") pod \"282a97b6-2e41-4e26-a941-77a9c94206cb\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") "
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.373422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4r82\" (UniqueName: \"kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82\") pod \"282a97b6-2e41-4e26-a941-77a9c94206cb\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") "
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.373626 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities\") pod \"282a97b6-2e41-4e26-a941-77a9c94206cb\" (UID: \"282a97b6-2e41-4e26-a941-77a9c94206cb\") "
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.374715 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities" (OuterVolumeSpecName: "utilities") pod "282a97b6-2e41-4e26-a941-77a9c94206cb" (UID: "282a97b6-2e41-4e26-a941-77a9c94206cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.384902 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82" (OuterVolumeSpecName: "kube-api-access-k4r82") pod "282a97b6-2e41-4e26-a941-77a9c94206cb" (UID: "282a97b6-2e41-4e26-a941-77a9c94206cb"). InnerVolumeSpecName "kube-api-access-k4r82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.404470 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "282a97b6-2e41-4e26-a941-77a9c94206cb" (UID: "282a97b6-2e41-4e26-a941-77a9c94206cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.475739 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.475774 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/282a97b6-2e41-4e26-a941-77a9c94206cb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.475785 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4r82\" (UniqueName: \"kubernetes.io/projected/282a97b6-2e41-4e26-a941-77a9c94206cb-kube-api-access-k4r82\") on node \"crc\" DevicePath \"\""
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.681752 4795 generic.go:334] "Generic (PLEG): container finished" podID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerID="b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4" exitCode=0
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.681807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerDied","Data":"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4"}
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.681844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99xjf" event={"ID":"282a97b6-2e41-4e26-a941-77a9c94206cb","Type":"ContainerDied","Data":"8cc1ef88ff1d93f454f1b32320bba18f8ff1fb8b9090c61383c2422173e8185a"}
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.681868 4795 scope.go:117] "RemoveContainer" containerID="b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4"
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.682005 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99xjf"
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.723343 4795 scope.go:117] "RemoveContainer" containerID="eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5"
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.726358 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"]
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.733159 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-99xjf"]
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.744524 4795 scope.go:117] "RemoveContainer" containerID="b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c"
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.819417 4795 scope.go:117] "RemoveContainer" containerID="b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4"
Mar 20 18:18:48 crc kubenswrapper[4795]: E0320 18:18:48.820051 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4\": container with ID starting with b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4 not found: ID does not exist" containerID="b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4"
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.820104 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4"} err="failed to get container status \"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4\": rpc error: code = NotFound desc = could not find container \"b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4\": container with ID starting with b86cf5e2c8c0efb7168628b530d194f1aa56912114dcf66eb3bffb633562d3a4 not found: ID does not exist"
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.820146 4795 scope.go:117] "RemoveContainer" containerID="eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5"
Mar 20 18:18:48 crc kubenswrapper[4795]: E0320 18:18:48.820500 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5\": container with ID starting with eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5 not found: ID does not exist" containerID="eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5"
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.820539 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5"} err="failed to get container status \"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5\": rpc error: code = NotFound desc = could not find container \"eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5\": container with ID starting with eb8784dce5475676abe9a8abc9f48a004e26ae14ece49dae7eccd4d935427de5 not found: ID does not exist"
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.820564 4795 scope.go:117] "RemoveContainer" containerID="b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c"
Mar 20 18:18:48 crc kubenswrapper[4795]: E0320 18:18:48.820885 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c\": container with ID starting with b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c not found: ID does not exist" containerID="b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c"
Mar 20 18:18:48 crc kubenswrapper[4795]: I0320 18:18:48.820929 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c"} err="failed to get container status \"b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c\": rpc error: code = NotFound desc = could not find container \"b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c\": container with ID starting with b6ddb0cadcbf4fb03816e6d6aa131805172b2adc7554120c81d5815349a4783c not found: ID does not exist"
Mar 20 18:18:49 crc kubenswrapper[4795]: I0320 18:18:49.278345 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" path="/var/lib/kubelet/pods/282a97b6-2e41-4e26-a941-77a9c94206cb/volumes"
Mar 20 18:19:11 crc kubenswrapper[4795]: I0320 18:19:11.300524 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 18:19:11 crc kubenswrapper[4795]: I0320 18:19:11.301256 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 18:19:41 crc kubenswrapper[4795]: I0320 18:19:41.301087 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 18:19:41 crc kubenswrapper[4795]: I0320 18:19:41.301938 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.145904 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567180-hdwfg"]
Mar 20 18:20:00 crc kubenswrapper[4795]: E0320 18:20:00.146794 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="extract-content"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.146809 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="extract-content"
Mar 20 18:20:00 crc kubenswrapper[4795]: E0320 18:20:00.146831 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="registry-server"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.146837 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="registry-server"
Mar 20 18:20:00 crc kubenswrapper[4795]: E0320 18:20:00.146854 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="extract-utilities"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.146861 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="extract-utilities"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.147043 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="282a97b6-2e41-4e26-a941-77a9c94206cb" containerName="registry-server"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.147565 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-hdwfg"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.151025 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.151297 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.153262 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.154058 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-hdwfg"]
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.248493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6sw\" (UniqueName: \"kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw\") pod \"auto-csr-approver-29567180-hdwfg\" (UID: \"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec\") " pod="openshift-infra/auto-csr-approver-29567180-hdwfg"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.350711 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6sw\" (UniqueName: \"kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw\") pod \"auto-csr-approver-29567180-hdwfg\" (UID: \"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec\") " pod="openshift-infra/auto-csr-approver-29567180-hdwfg"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.369321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6sw\" (UniqueName: \"kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw\") pod \"auto-csr-approver-29567180-hdwfg\" (UID: \"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec\") " pod="openshift-infra/auto-csr-approver-29567180-hdwfg"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.467780 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-hdwfg"
Mar 20 18:20:00 crc kubenswrapper[4795]: I0320 18:20:00.937104 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-hdwfg"]
Mar 20 18:20:01 crc kubenswrapper[4795]: I0320 18:20:01.454491 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" event={"ID":"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec","Type":"ContainerStarted","Data":"adde32c54b8aabda2e836b8d23281371028b7d7d10f74d51b930d44f71cc947d"}
Mar 20 18:20:02 crc kubenswrapper[4795]: I0320 18:20:02.465908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" event={"ID":"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec","Type":"ContainerStarted","Data":"fa6146c0ab5a972766bc66e4407b6570440e104ac1129b01c598c47dc5cc0faf"}
Mar 20 18:20:02 crc kubenswrapper[4795]: I0320 18:20:02.499014 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" podStartSLOduration=1.568468051 podStartE2EDuration="2.498976416s" podCreationTimestamp="2026-03-20 18:20:00 +0000 UTC" firstStartedPulling="2026-03-20 18:20:00.933555343 +0000 UTC m=+3744.391586894" lastFinishedPulling="2026-03-20 18:20:01.864063718 +0000 UTC m=+3745.322095259" observedRunningTime="2026-03-20 18:20:02.484085823 +0000 UTC m=+3745.942117374" watchObservedRunningTime="2026-03-20 18:20:02.498976416 +0000 UTC m=+3745.957007997"
Mar 20 18:20:03 crc kubenswrapper[4795]: I0320 18:20:03.477609 4795 generic.go:334] "Generic (PLEG): container finished" podID="78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" containerID="fa6146c0ab5a972766bc66e4407b6570440e104ac1129b01c598c47dc5cc0faf" exitCode=0
Mar 20 18:20:03 crc kubenswrapper[4795]: I0320 18:20:03.477715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" event={"ID":"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec","Type":"ContainerDied","Data":"fa6146c0ab5a972766bc66e4407b6570440e104ac1129b01c598c47dc5cc0faf"}
Mar 20 18:20:04 crc kubenswrapper[4795]: I0320 18:20:04.967912 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-hdwfg"
Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.158325 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l6sw\" (UniqueName: \"kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw\") pod \"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec\" (UID: \"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec\") "
Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.164236 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw" (OuterVolumeSpecName: "kube-api-access-5l6sw") pod "78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" (UID: "78d7fab6-a6ea-4dee-bd81-84a6cfb81aec"). InnerVolumeSpecName "kube-api-access-5l6sw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.260002 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l6sw\" (UniqueName: \"kubernetes.io/projected/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec-kube-api-access-5l6sw\") on node \"crc\" DevicePath \"\""
Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.534174 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-hdwfg" event={"ID":"78d7fab6-a6ea-4dee-bd81-84a6cfb81aec","Type":"ContainerDied","Data":"adde32c54b8aabda2e836b8d23281371028b7d7d10f74d51b930d44f71cc947d"}
Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.534763 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adde32c54b8aabda2e836b8d23281371028b7d7d10f74d51b930d44f71cc947d"
Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.534943 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-hdwfg"
Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.577958 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-8lkms"]
Mar 20 18:20:05 crc kubenswrapper[4795]: I0320 18:20:05.586511 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-8lkms"]
Mar 20 18:20:07 crc kubenswrapper[4795]: I0320 18:20:07.270393 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d290fd42-4040-428e-8af1-8091250112e7" path="/var/lib/kubelet/pods/d290fd42-4040-428e-8af1-8091250112e7/volumes"
Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.300481 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.301099 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.301160 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt"
Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.302578 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.302669 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" gracePeriod=600
Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.366917 4795 scope.go:117] "RemoveContainer" containerID="6d1a2371250aa4bfa6255ea0f649377d871a255b30df541957c8c5e80c58e7c1"
Mar 20 18:20:11 crc kubenswrapper[4795]: E0320 18:20:11.461839 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.606823 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" exitCode=0
Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.606885 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"}
Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.606934 4795 scope.go:117] "RemoveContainer" containerID="a9fdc4b7059add81d945919b10964cde2cd3bd8ed5c14c3eb55d66a07cc7643d"
Mar 20 18:20:11 crc kubenswrapper[4795]: I0320 18:20:11.607839 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"
Mar 20 18:20:11 crc kubenswrapper[4795]: E0320 18:20:11.608386 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:20:25 crc kubenswrapper[4795]: I0320 18:20:25.252631 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"
Mar 20 18:20:25 crc kubenswrapper[4795]: E0320 18:20:25.253375 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:20:40 crc kubenswrapper[4795]: I0320 18:20:40.252232 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"
Mar 20 18:20:40 crc kubenswrapper[4795]: E0320 18:20:40.253434 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:20:55 crc kubenswrapper[4795]: I0320 18:20:55.252785 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"
Mar 20 18:20:55 crc kubenswrapper[4795]: E0320 18:20:55.253598 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:21:08 crc kubenswrapper[4795]: I0320 18:21:08.252027 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"
Mar 20 18:21:08 crc kubenswrapper[4795]: E0320 18:21:08.253059 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:21:23 crc kubenswrapper[4795]: I0320 18:21:23.252712 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"
Mar 20 18:21:23 crc kubenswrapper[4795]: E0320 18:21:23.253376 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:21:38 crc kubenswrapper[4795]: I0320 18:21:38.252792 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"
Mar 20 18:21:38 crc kubenswrapper[4795]: E0320 18:21:38.253979 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:21:53 crc kubenswrapper[4795]: I0320 18:21:53.253124 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8"
Mar 20 18:21:53 crc kubenswrapper[4795]: E0320 18:21:53.254388 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.153089 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567182-5tldr"]
Mar 20 18:22:00 crc kubenswrapper[4795]: E0320 18:22:00.154173 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" containerName="oc"
Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.154189 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" containerName="oc"
Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.154415 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" containerName="oc"
Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.155143 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.162351 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.162386 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.162650 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.174808 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-5tldr"] Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.278438 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pwkq\" (UniqueName: \"kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq\") pod \"auto-csr-approver-29567182-5tldr\" (UID: \"478a3729-f417-4458-b0c5-562ed9c72252\") " pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.381114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pwkq\" (UniqueName: \"kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq\") pod \"auto-csr-approver-29567182-5tldr\" (UID: \"478a3729-f417-4458-b0c5-562ed9c72252\") " pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.401170 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pwkq\" (UniqueName: \"kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq\") pod \"auto-csr-approver-29567182-5tldr\" (UID: \"478a3729-f417-4458-b0c5-562ed9c72252\") " 
pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.481731 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:00 crc kubenswrapper[4795]: I0320 18:22:00.926749 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-5tldr"] Mar 20 18:22:01 crc kubenswrapper[4795]: I0320 18:22:01.748431 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-5tldr" event={"ID":"478a3729-f417-4458-b0c5-562ed9c72252","Type":"ContainerStarted","Data":"0c73ef664439c6cd1a470ac299c5d4845b2d54a38cb2b91cfd6d7b76f6d9c6d7"} Mar 20 18:22:02 crc kubenswrapper[4795]: E0320 18:22:02.613577 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod478a3729_f417_4458_b0c5_562ed9c72252.slice/crio-conmon-693c3bcc8927765191605aa2e63974d17e30d9c34ae2ebaf265ed06522fb9c6b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod478a3729_f417_4458_b0c5_562ed9c72252.slice/crio-693c3bcc8927765191605aa2e63974d17e30d9c34ae2ebaf265ed06522fb9c6b.scope\": RecentStats: unable to find data in memory cache]" Mar 20 18:22:02 crc kubenswrapper[4795]: I0320 18:22:02.766671 4795 generic.go:334] "Generic (PLEG): container finished" podID="478a3729-f417-4458-b0c5-562ed9c72252" containerID="693c3bcc8927765191605aa2e63974d17e30d9c34ae2ebaf265ed06522fb9c6b" exitCode=0 Mar 20 18:22:02 crc kubenswrapper[4795]: I0320 18:22:02.766768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-5tldr" event={"ID":"478a3729-f417-4458-b0c5-562ed9c72252","Type":"ContainerDied","Data":"693c3bcc8927765191605aa2e63974d17e30d9c34ae2ebaf265ed06522fb9c6b"} Mar 20 
18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.273072 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.354917 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pwkq\" (UniqueName: \"kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq\") pod \"478a3729-f417-4458-b0c5-562ed9c72252\" (UID: \"478a3729-f417-4458-b0c5-562ed9c72252\") " Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.361530 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq" (OuterVolumeSpecName: "kube-api-access-6pwkq") pod "478a3729-f417-4458-b0c5-562ed9c72252" (UID: "478a3729-f417-4458-b0c5-562ed9c72252"). InnerVolumeSpecName "kube-api-access-6pwkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.457257 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pwkq\" (UniqueName: \"kubernetes.io/projected/478a3729-f417-4458-b0c5-562ed9c72252-kube-api-access-6pwkq\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.789738 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-5tldr" event={"ID":"478a3729-f417-4458-b0c5-562ed9c72252","Type":"ContainerDied","Data":"0c73ef664439c6cd1a470ac299c5d4845b2d54a38cb2b91cfd6d7b76f6d9c6d7"} Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.789784 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c73ef664439c6cd1a470ac299c5d4845b2d54a38cb2b91cfd6d7b76f6d9c6d7" Mar 20 18:22:04 crc kubenswrapper[4795]: I0320 18:22:04.789844 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-5tldr" Mar 20 18:22:05 crc kubenswrapper[4795]: I0320 18:22:05.412003 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-jlvh2"] Mar 20 18:22:05 crc kubenswrapper[4795]: I0320 18:22:05.429942 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-jlvh2"] Mar 20 18:22:07 crc kubenswrapper[4795]: I0320 18:22:07.264741 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:22:07 crc kubenswrapper[4795]: I0320 18:22:07.264959 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6847a127-2563-4611-aa3c-5de097af7485" path="/var/lib/kubelet/pods/6847a127-2563-4611-aa3c-5de097af7485/volumes" Mar 20 18:22:07 crc kubenswrapper[4795]: E0320 18:22:07.265043 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:22:11 crc kubenswrapper[4795]: I0320 18:22:11.494423 4795 scope.go:117] "RemoveContainer" containerID="fcb0c54c2a527f381862afe1aaeeba3ced38b835a91522600710892ac634473c" Mar 20 18:22:20 crc kubenswrapper[4795]: I0320 18:22:20.251783 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:22:20 crc kubenswrapper[4795]: E0320 18:22:20.252526 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:22:33 crc kubenswrapper[4795]: I0320 18:22:33.255130 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:22:33 crc kubenswrapper[4795]: E0320 18:22:33.260237 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:22:44 crc kubenswrapper[4795]: I0320 18:22:44.252002 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:22:44 crc kubenswrapper[4795]: E0320 18:22:44.253053 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:22:59 crc kubenswrapper[4795]: I0320 18:22:59.256118 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:22:59 crc kubenswrapper[4795]: E0320 18:22:59.257082 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:23:14 crc kubenswrapper[4795]: I0320 18:23:14.251613 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:23:14 crc kubenswrapper[4795]: E0320 18:23:14.252326 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:23:29 crc kubenswrapper[4795]: I0320 18:23:29.253508 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:23:29 crc kubenswrapper[4795]: E0320 18:23:29.254344 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:23:44 crc kubenswrapper[4795]: I0320 18:23:44.253138 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:23:44 crc kubenswrapper[4795]: E0320 18:23:44.254447 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:23:58 crc kubenswrapper[4795]: I0320 18:23:58.252514 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:23:58 crc kubenswrapper[4795]: E0320 18:23:58.253356 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.151708 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567184-gzns5"] Mar 20 18:24:00 crc kubenswrapper[4795]: E0320 18:24:00.152536 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478a3729-f417-4458-b0c5-562ed9c72252" containerName="oc" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.152553 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="478a3729-f417-4458-b0c5-562ed9c72252" containerName="oc" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.152848 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="478a3729-f417-4458-b0c5-562ed9c72252" containerName="oc" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.153629 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.157186 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.157618 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.159249 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.164436 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-gzns5"] Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.235343 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzfmq\" (UniqueName: \"kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq\") pod \"auto-csr-approver-29567184-gzns5\" (UID: \"133e7bff-461c-4450-bf3b-8d43791045a4\") " pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.337026 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzfmq\" (UniqueName: \"kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq\") pod \"auto-csr-approver-29567184-gzns5\" (UID: \"133e7bff-461c-4450-bf3b-8d43791045a4\") " pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.355444 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzfmq\" (UniqueName: \"kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq\") pod \"auto-csr-approver-29567184-gzns5\" (UID: \"133e7bff-461c-4450-bf3b-8d43791045a4\") " 
pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.475356 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.928658 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-gzns5"] Mar 20 18:24:00 crc kubenswrapper[4795]: W0320 18:24:00.937253 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod133e7bff_461c_4450_bf3b_8d43791045a4.slice/crio-4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852 WatchSource:0}: Error finding container 4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852: Status 404 returned error can't find the container with id 4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852 Mar 20 18:24:00 crc kubenswrapper[4795]: I0320 18:24:00.940598 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:24:01 crc kubenswrapper[4795]: I0320 18:24:01.944975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-gzns5" event={"ID":"133e7bff-461c-4450-bf3b-8d43791045a4","Type":"ContainerStarted","Data":"4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852"} Mar 20 18:24:02 crc kubenswrapper[4795]: I0320 18:24:02.965193 4795 generic.go:334] "Generic (PLEG): container finished" podID="133e7bff-461c-4450-bf3b-8d43791045a4" containerID="ccece14cbf4b4c8c9889d9ebad0a41bd1c87c88349be10c251ccd8a08eb4cac4" exitCode=0 Mar 20 18:24:02 crc kubenswrapper[4795]: I0320 18:24:02.965262 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-gzns5" 
event={"ID":"133e7bff-461c-4450-bf3b-8d43791045a4","Type":"ContainerDied","Data":"ccece14cbf4b4c8c9889d9ebad0a41bd1c87c88349be10c251ccd8a08eb4cac4"} Mar 20 18:24:04 crc kubenswrapper[4795]: I0320 18:24:04.401625 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:04 crc kubenswrapper[4795]: I0320 18:24:04.512762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzfmq\" (UniqueName: \"kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq\") pod \"133e7bff-461c-4450-bf3b-8d43791045a4\" (UID: \"133e7bff-461c-4450-bf3b-8d43791045a4\") " Mar 20 18:24:04 crc kubenswrapper[4795]: I0320 18:24:04.523007 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq" (OuterVolumeSpecName: "kube-api-access-qzfmq") pod "133e7bff-461c-4450-bf3b-8d43791045a4" (UID: "133e7bff-461c-4450-bf3b-8d43791045a4"). InnerVolumeSpecName "kube-api-access-qzfmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:24:04 crc kubenswrapper[4795]: I0320 18:24:04.614943 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzfmq\" (UniqueName: \"kubernetes.io/projected/133e7bff-461c-4450-bf3b-8d43791045a4-kube-api-access-qzfmq\") on node \"crc\" DevicePath \"\"" Mar 20 18:24:05 crc kubenswrapper[4795]: I0320 18:24:05.003459 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-gzns5" event={"ID":"133e7bff-461c-4450-bf3b-8d43791045a4","Type":"ContainerDied","Data":"4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852"} Mar 20 18:24:05 crc kubenswrapper[4795]: I0320 18:24:05.003496 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca157210d09550b06e36df966d06ef94e3f4c6f430326733d33dcea5c821852" Mar 20 18:24:05 crc kubenswrapper[4795]: I0320 18:24:05.004016 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-gzns5" Mar 20 18:24:05 crc kubenswrapper[4795]: I0320 18:24:05.484201 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-vbj6l"] Mar 20 18:24:05 crc kubenswrapper[4795]: I0320 18:24:05.493805 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-vbj6l"] Mar 20 18:24:07 crc kubenswrapper[4795]: I0320 18:24:07.268147 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf1bb697-899c-48fe-984a-61258e78cd87" path="/var/lib/kubelet/pods/cf1bb697-899c-48fe-984a-61258e78cd87/volumes" Mar 20 18:24:09 crc kubenswrapper[4795]: I0320 18:24:09.256440 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:24:09 crc kubenswrapper[4795]: E0320 18:24:09.257113 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:24:11 crc kubenswrapper[4795]: I0320 18:24:11.617393 4795 scope.go:117] "RemoveContainer" containerID="78ef24a78e7dcac7e46de63dc467ed76c00bbe2831c0a8a33ad6b914782524d5" Mar 20 18:24:21 crc kubenswrapper[4795]: I0320 18:24:21.251999 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:24:21 crc kubenswrapper[4795]: E0320 18:24:21.253521 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:24:36 crc kubenswrapper[4795]: I0320 18:24:36.251928 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:24:36 crc kubenswrapper[4795]: E0320 18:24:36.252783 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:24:50 crc kubenswrapper[4795]: I0320 18:24:50.252958 4795 scope.go:117] "RemoveContainer" 
containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:24:50 crc kubenswrapper[4795]: E0320 18:24:50.253869 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.252342 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:25:04 crc kubenswrapper[4795]: E0320 18:25:04.253222 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.567157 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:04 crc kubenswrapper[4795]: E0320 18:25:04.567505 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133e7bff-461c-4450-bf3b-8d43791045a4" containerName="oc" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.567522 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="133e7bff-461c-4450-bf3b-8d43791045a4" containerName="oc" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.567746 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="133e7bff-461c-4450-bf3b-8d43791045a4" containerName="oc" Mar 20 18:25:04 crc 
kubenswrapper[4795]: I0320 18:25:04.568997 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.595847 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.672195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97lxg\" (UniqueName: \"kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.672412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.672465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.773725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97lxg\" (UniqueName: \"kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc 
kubenswrapper[4795]: I0320 18:25:04.773880 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.773924 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.774354 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.774450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.792139 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97lxg\" (UniqueName: \"kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg\") pod \"redhat-operators-s45dr\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:04 crc kubenswrapper[4795]: I0320 18:25:04.909109 4795 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:05 crc kubenswrapper[4795]: W0320 18:25:05.389013 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5534910c_2643_4a80_8d50_82267e2567e0.slice/crio-2ad10827e27678d862a3371f7fc9ea6b3045e4278d126e488871cefd1947de82 WatchSource:0}: Error finding container 2ad10827e27678d862a3371f7fc9ea6b3045e4278d126e488871cefd1947de82: Status 404 returned error can't find the container with id 2ad10827e27678d862a3371f7fc9ea6b3045e4278d126e488871cefd1947de82 Mar 20 18:25:05 crc kubenswrapper[4795]: I0320 18:25:05.391373 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:05 crc kubenswrapper[4795]: I0320 18:25:05.544383 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerStarted","Data":"2ad10827e27678d862a3371f7fc9ea6b3045e4278d126e488871cefd1947de82"} Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.368402 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.371600 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.382587 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.504560 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.504639 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qlzf\" (UniqueName: \"kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.504740 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.552475 4795 generic.go:334] "Generic (PLEG): container finished" podID="5534910c-2643-4a80-8d50-82267e2567e0" containerID="9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6" exitCode=0 Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.552513 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" 
event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerDied","Data":"9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6"} Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.606894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.606964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qlzf\" (UniqueName: \"kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.607025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.607768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.607777 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content\") pod 
\"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.632954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qlzf\" (UniqueName: \"kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf\") pod \"community-operators-8s8nl\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:06 crc kubenswrapper[4795]: I0320 18:25:06.691189 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:07 crc kubenswrapper[4795]: I0320 18:25:07.219283 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:07 crc kubenswrapper[4795]: I0320 18:25:07.565118 4795 generic.go:334] "Generic (PLEG): container finished" podID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerID="847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1" exitCode=0 Mar 20 18:25:07 crc kubenswrapper[4795]: I0320 18:25:07.565520 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerDied","Data":"847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1"} Mar 20 18:25:07 crc kubenswrapper[4795]: I0320 18:25:07.565568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerStarted","Data":"0988077d812aa21cebf551a9213e7cdabc69aef5852f2e0a86b4777259c6f40a"} Mar 20 18:25:08 crc kubenswrapper[4795]: I0320 18:25:08.593355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" 
event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerStarted","Data":"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1"} Mar 20 18:25:08 crc kubenswrapper[4795]: I0320 18:25:08.596080 4795 generic.go:334] "Generic (PLEG): container finished" podID="5534910c-2643-4a80-8d50-82267e2567e0" containerID="1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4" exitCode=0 Mar 20 18:25:08 crc kubenswrapper[4795]: I0320 18:25:08.596115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerDied","Data":"1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4"} Mar 20 18:25:09 crc kubenswrapper[4795]: I0320 18:25:09.606520 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerStarted","Data":"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a"} Mar 20 18:25:09 crc kubenswrapper[4795]: I0320 18:25:09.608961 4795 generic.go:334] "Generic (PLEG): container finished" podID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerID="3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1" exitCode=0 Mar 20 18:25:09 crc kubenswrapper[4795]: I0320 18:25:09.609016 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerDied","Data":"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1"} Mar 20 18:25:09 crc kubenswrapper[4795]: I0320 18:25:09.634060 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s45dr" podStartSLOduration=3.19629929 podStartE2EDuration="5.634036269s" podCreationTimestamp="2026-03-20 18:25:04 +0000 UTC" firstStartedPulling="2026-03-20 18:25:06.553993886 +0000 UTC 
m=+4050.012025427" lastFinishedPulling="2026-03-20 18:25:08.991730875 +0000 UTC m=+4052.449762406" observedRunningTime="2026-03-20 18:25:09.626888117 +0000 UTC m=+4053.084919658" watchObservedRunningTime="2026-03-20 18:25:09.634036269 +0000 UTC m=+4053.092067820" Mar 20 18:25:10 crc kubenswrapper[4795]: I0320 18:25:10.619283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerStarted","Data":"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8"} Mar 20 18:25:10 crc kubenswrapper[4795]: I0320 18:25:10.644558 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8s8nl" podStartSLOduration=1.967046352 podStartE2EDuration="4.644536491s" podCreationTimestamp="2026-03-20 18:25:06 +0000 UTC" firstStartedPulling="2026-03-20 18:25:07.567796171 +0000 UTC m=+4051.025827752" lastFinishedPulling="2026-03-20 18:25:10.24528634 +0000 UTC m=+4053.703317891" observedRunningTime="2026-03-20 18:25:10.641024701 +0000 UTC m=+4054.099056262" watchObservedRunningTime="2026-03-20 18:25:10.644536491 +0000 UTC m=+4054.102568042" Mar 20 18:25:14 crc kubenswrapper[4795]: I0320 18:25:14.909948 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:14 crc kubenswrapper[4795]: I0320 18:25:14.910228 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:15 crc kubenswrapper[4795]: I0320 18:25:15.960309 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s45dr" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="registry-server" probeResult="failure" output=< Mar 20 18:25:15 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 18:25:15 crc 
kubenswrapper[4795]: > Mar 20 18:25:16 crc kubenswrapper[4795]: I0320 18:25:16.253269 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:25:16 crc kubenswrapper[4795]: I0320 18:25:16.691922 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:16 crc kubenswrapper[4795]: I0320 18:25:16.692212 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:16 crc kubenswrapper[4795]: I0320 18:25:16.749765 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:17 crc kubenswrapper[4795]: I0320 18:25:17.711332 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc"} Mar 20 18:25:17 crc kubenswrapper[4795]: I0320 18:25:17.791271 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:17 crc kubenswrapper[4795]: I0320 18:25:17.858368 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:19 crc kubenswrapper[4795]: I0320 18:25:19.731006 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8s8nl" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="registry-server" containerID="cri-o://bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8" gracePeriod=2 Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.327649 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.403511 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content\") pod \"3336a777-640c-4ec9-a1f7-27a05d6efe01\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.403673 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities\") pod \"3336a777-640c-4ec9-a1f7-27a05d6efe01\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.403928 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qlzf\" (UniqueName: \"kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf\") pod \"3336a777-640c-4ec9-a1f7-27a05d6efe01\" (UID: \"3336a777-640c-4ec9-a1f7-27a05d6efe01\") " Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.406086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities" (OuterVolumeSpecName: "utilities") pod "3336a777-640c-4ec9-a1f7-27a05d6efe01" (UID: "3336a777-640c-4ec9-a1f7-27a05d6efe01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.411073 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf" (OuterVolumeSpecName: "kube-api-access-8qlzf") pod "3336a777-640c-4ec9-a1f7-27a05d6efe01" (UID: "3336a777-640c-4ec9-a1f7-27a05d6efe01"). InnerVolumeSpecName "kube-api-access-8qlzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.462033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3336a777-640c-4ec9-a1f7-27a05d6efe01" (UID: "3336a777-640c-4ec9-a1f7-27a05d6efe01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.506783 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.506825 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qlzf\" (UniqueName: \"kubernetes.io/projected/3336a777-640c-4ec9-a1f7-27a05d6efe01-kube-api-access-8qlzf\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.506844 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336a777-640c-4ec9-a1f7-27a05d6efe01-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.742095 4795 generic.go:334] "Generic (PLEG): container finished" podID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerID="bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8" exitCode=0 Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.742294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerDied","Data":"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8"} Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.742353 4795 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s8nl" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.742404 4795 scope.go:117] "RemoveContainer" containerID="bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.742394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s8nl" event={"ID":"3336a777-640c-4ec9-a1f7-27a05d6efe01","Type":"ContainerDied","Data":"0988077d812aa21cebf551a9213e7cdabc69aef5852f2e0a86b4777259c6f40a"} Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.768460 4795 scope.go:117] "RemoveContainer" containerID="3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1" Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.773321 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:20 crc kubenswrapper[4795]: I0320 18:25:20.781806 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8s8nl"] Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.271141 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" path="/var/lib/kubelet/pods/3336a777-640c-4ec9-a1f7-27a05d6efe01/volumes" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.284211 4795 scope.go:117] "RemoveContainer" containerID="847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.365951 4795 scope.go:117] "RemoveContainer" containerID="bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8" Mar 20 18:25:21 crc kubenswrapper[4795]: E0320 18:25:21.366573 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8\": container with ID 
starting with bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8 not found: ID does not exist" containerID="bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.366722 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8"} err="failed to get container status \"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8\": rpc error: code = NotFound desc = could not find container \"bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8\": container with ID starting with bb3f505f5a099123b56f72beeb604982279bcff4aca9f6fc637fefffd5f646f8 not found: ID does not exist" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.366758 4795 scope.go:117] "RemoveContainer" containerID="3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1" Mar 20 18:25:21 crc kubenswrapper[4795]: E0320 18:25:21.367128 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1\": container with ID starting with 3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1 not found: ID does not exist" containerID="3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.367163 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1"} err="failed to get container status \"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1\": rpc error: code = NotFound desc = could not find container \"3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1\": container with ID starting with 3f211f8c89b85880f53bf2007629c0264ab12d7fa21e57ba98227b959dd243a1 not found: 
ID does not exist" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.367184 4795 scope.go:117] "RemoveContainer" containerID="847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1" Mar 20 18:25:21 crc kubenswrapper[4795]: E0320 18:25:21.367423 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1\": container with ID starting with 847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1 not found: ID does not exist" containerID="847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1" Mar 20 18:25:21 crc kubenswrapper[4795]: I0320 18:25:21.367452 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1"} err="failed to get container status \"847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1\": rpc error: code = NotFound desc = could not find container \"847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1\": container with ID starting with 847b7479414131e830dbb020eaa6ebe519928d61fbfff7bc4f55dab3bbf1e8d1 not found: ID does not exist" Mar 20 18:25:24 crc kubenswrapper[4795]: I0320 18:25:24.970553 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:25 crc kubenswrapper[4795]: I0320 18:25:25.031082 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:25 crc kubenswrapper[4795]: I0320 18:25:25.324862 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:26 crc kubenswrapper[4795]: I0320 18:25:26.808974 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s45dr" 
podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="registry-server" containerID="cri-o://9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a" gracePeriod=2 Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.316768 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.451865 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content\") pod \"5534910c-2643-4a80-8d50-82267e2567e0\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.452010 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97lxg\" (UniqueName: \"kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg\") pod \"5534910c-2643-4a80-8d50-82267e2567e0\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.452893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities\") pod \"5534910c-2643-4a80-8d50-82267e2567e0\" (UID: \"5534910c-2643-4a80-8d50-82267e2567e0\") " Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.453733 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities" (OuterVolumeSpecName: "utilities") pod "5534910c-2643-4a80-8d50-82267e2567e0" (UID: "5534910c-2643-4a80-8d50-82267e2567e0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.453824 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.458338 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg" (OuterVolumeSpecName: "kube-api-access-97lxg") pod "5534910c-2643-4a80-8d50-82267e2567e0" (UID: "5534910c-2643-4a80-8d50-82267e2567e0"). InnerVolumeSpecName "kube-api-access-97lxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.555056 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97lxg\" (UniqueName: \"kubernetes.io/projected/5534910c-2643-4a80-8d50-82267e2567e0-kube-api-access-97lxg\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.598232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5534910c-2643-4a80-8d50-82267e2567e0" (UID: "5534910c-2643-4a80-8d50-82267e2567e0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.657191 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5534910c-2643-4a80-8d50-82267e2567e0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.819959 4795 generic.go:334] "Generic (PLEG): container finished" podID="5534910c-2643-4a80-8d50-82267e2567e0" containerID="9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a" exitCode=0 Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.820037 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s45dr" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.820060 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerDied","Data":"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a"} Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.820486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s45dr" event={"ID":"5534910c-2643-4a80-8d50-82267e2567e0","Type":"ContainerDied","Data":"2ad10827e27678d862a3371f7fc9ea6b3045e4278d126e488871cefd1947de82"} Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.820523 4795 scope.go:117] "RemoveContainer" containerID="9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.863508 4795 scope.go:117] "RemoveContainer" containerID="1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.865351 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 
18:25:27.873492 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s45dr"] Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.899645 4795 scope.go:117] "RemoveContainer" containerID="9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.964228 4795 scope.go:117] "RemoveContainer" containerID="9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a" Mar 20 18:25:27 crc kubenswrapper[4795]: E0320 18:25:27.964941 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a\": container with ID starting with 9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a not found: ID does not exist" containerID="9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.964980 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a"} err="failed to get container status \"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a\": rpc error: code = NotFound desc = could not find container \"9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a\": container with ID starting with 9566c85f5bb2798a785fc29bd290f1371ebd3860b97acc022cc8b1fb96f86b9a not found: ID does not exist" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.965005 4795 scope.go:117] "RemoveContainer" containerID="1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4" Mar 20 18:25:27 crc kubenswrapper[4795]: E0320 18:25:27.965314 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4\": container with ID 
starting with 1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4 not found: ID does not exist" containerID="1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.965338 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4"} err="failed to get container status \"1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4\": rpc error: code = NotFound desc = could not find container \"1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4\": container with ID starting with 1d4ef8c8dbabd67fa7e95fd86c778a2b630745f7068c95fba3346e203e9b52a4 not found: ID does not exist" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.965352 4795 scope.go:117] "RemoveContainer" containerID="9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6" Mar 20 18:25:27 crc kubenswrapper[4795]: E0320 18:25:27.965571 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6\": container with ID starting with 9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6 not found: ID does not exist" containerID="9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6" Mar 20 18:25:27 crc kubenswrapper[4795]: I0320 18:25:27.965594 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6"} err="failed to get container status \"9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6\": rpc error: code = NotFound desc = could not find container \"9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6\": container with ID starting with 9d759f6f94aab3a8f1a3f131a43b3b39c07f98a83aafae114b785b4434825df6 not found: 
ID does not exist" Mar 20 18:25:29 crc kubenswrapper[4795]: I0320 18:25:29.274170 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5534910c-2643-4a80-8d50-82267e2567e0" path="/var/lib/kubelet/pods/5534910c-2643-4a80-8d50-82267e2567e0/volumes" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.164604 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567186-l88zk"] Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166363 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="extract-utilities" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166393 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="extract-utilities" Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166415 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="extract-content" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166422 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="extract-content" Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166441 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166449 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166464 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="extract-content" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166470 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5534910c-2643-4a80-8d50-82267e2567e0" 
containerName="extract-content" Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166498 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166506 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: E0320 18:26:00.166520 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="extract-utilities" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166529 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="extract-utilities" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166741 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3336a777-640c-4ec9-a1f7-27a05d6efe01" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.166764 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5534910c-2643-4a80-8d50-82267e2567e0" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.167571 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.170180 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.170231 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.170436 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.186654 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-l88zk"] Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.268922 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ztgr\" (UniqueName: \"kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr\") pod \"auto-csr-approver-29567186-l88zk\" (UID: \"654cb8e4-7fd7-4e3e-955a-a71906ccfb79\") " pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.372028 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ztgr\" (UniqueName: \"kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr\") pod \"auto-csr-approver-29567186-l88zk\" (UID: \"654cb8e4-7fd7-4e3e-955a-a71906ccfb79\") " pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.399459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ztgr\" (UniqueName: \"kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr\") pod \"auto-csr-approver-29567186-l88zk\" (UID: \"654cb8e4-7fd7-4e3e-955a-a71906ccfb79\") " 
pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:00 crc kubenswrapper[4795]: I0320 18:26:00.503938 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:01 crc kubenswrapper[4795]: I0320 18:26:01.011584 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-l88zk"] Mar 20 18:26:01 crc kubenswrapper[4795]: I0320 18:26:01.210647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-l88zk" event={"ID":"654cb8e4-7fd7-4e3e-955a-a71906ccfb79","Type":"ContainerStarted","Data":"2dbb1b4ae1e3b69e21cb5fdf19f437788fb04da0d018b660857ad86909e5691e"} Mar 20 18:26:03 crc kubenswrapper[4795]: I0320 18:26:03.232330 4795 generic.go:334] "Generic (PLEG): container finished" podID="654cb8e4-7fd7-4e3e-955a-a71906ccfb79" containerID="43b386b54b2c5ae34c509074586a617552173f2543677683ef8d11caf140f2f9" exitCode=0 Mar 20 18:26:03 crc kubenswrapper[4795]: I0320 18:26:03.232435 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-l88zk" event={"ID":"654cb8e4-7fd7-4e3e-955a-a71906ccfb79","Type":"ContainerDied","Data":"43b386b54b2c5ae34c509074586a617552173f2543677683ef8d11caf140f2f9"} Mar 20 18:26:04 crc kubenswrapper[4795]: I0320 18:26:04.654162 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:04 crc kubenswrapper[4795]: I0320 18:26:04.754463 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ztgr\" (UniqueName: \"kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr\") pod \"654cb8e4-7fd7-4e3e-955a-a71906ccfb79\" (UID: \"654cb8e4-7fd7-4e3e-955a-a71906ccfb79\") " Mar 20 18:26:04 crc kubenswrapper[4795]: I0320 18:26:04.762934 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr" (OuterVolumeSpecName: "kube-api-access-6ztgr") pod "654cb8e4-7fd7-4e3e-955a-a71906ccfb79" (UID: "654cb8e4-7fd7-4e3e-955a-a71906ccfb79"). InnerVolumeSpecName "kube-api-access-6ztgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:26:04 crc kubenswrapper[4795]: I0320 18:26:04.856782 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ztgr\" (UniqueName: \"kubernetes.io/projected/654cb8e4-7fd7-4e3e-955a-a71906ccfb79-kube-api-access-6ztgr\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:05 crc kubenswrapper[4795]: I0320 18:26:05.263387 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-l88zk" Mar 20 18:26:05 crc kubenswrapper[4795]: I0320 18:26:05.286987 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-l88zk" event={"ID":"654cb8e4-7fd7-4e3e-955a-a71906ccfb79","Type":"ContainerDied","Data":"2dbb1b4ae1e3b69e21cb5fdf19f437788fb04da0d018b660857ad86909e5691e"} Mar 20 18:26:05 crc kubenswrapper[4795]: I0320 18:26:05.287048 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dbb1b4ae1e3b69e21cb5fdf19f437788fb04da0d018b660857ad86909e5691e" Mar 20 18:26:05 crc kubenswrapper[4795]: I0320 18:26:05.719123 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-hdwfg"] Mar 20 18:26:05 crc kubenswrapper[4795]: I0320 18:26:05.727470 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-hdwfg"] Mar 20 18:26:07 crc kubenswrapper[4795]: I0320 18:26:07.260971 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d7fab6-a6ea-4dee-bd81-84a6cfb81aec" path="/var/lib/kubelet/pods/78d7fab6-a6ea-4dee-bd81-84a6cfb81aec/volumes" Mar 20 18:26:11 crc kubenswrapper[4795]: I0320 18:26:11.719702 4795 scope.go:117] "RemoveContainer" containerID="fa6146c0ab5a972766bc66e4407b6570440e104ac1129b01c598c47dc5cc0faf" Mar 20 18:26:53 crc kubenswrapper[4795]: I0320 18:26:53.768118 4795 generic.go:334] "Generic (PLEG): container finished" podID="caaf60a5-8c45-4831-8d26-8cf808f1da7a" containerID="ca1e86805a9f6b3f6807f721075e3f792e3f51254780ac13719c7eec007f4373" exitCode=0 Mar 20 18:26:53 crc kubenswrapper[4795]: I0320 18:26:53.768238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"caaf60a5-8c45-4831-8d26-8cf808f1da7a","Type":"ContainerDied","Data":"ca1e86805a9f6b3f6807f721075e3f792e3f51254780ac13719c7eec007f4373"} Mar 20 18:26:55 crc kubenswrapper[4795]: 
I0320 18:26:55.211393 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.321984 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.322247 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.322310 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.322343 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.322412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w88sw\" (UniqueName: \"kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.322988 
4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.323042 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.323078 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.323113 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret\") pod \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\" (UID: \"caaf60a5-8c45-4831-8d26-8cf808f1da7a\") " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.323683 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.323725 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data" (OuterVolumeSpecName: "config-data") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.333259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.334073 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.334270 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw" (OuterVolumeSpecName: "kube-api-access-w88sw") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "kube-api-access-w88sw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.354578 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.361633 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.374806 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.385827 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "caaf60a5-8c45-4831-8d26-8cf808f1da7a" (UID: "caaf60a5-8c45-4831-8d26-8cf808f1da7a"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.425525 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.425759 4795 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.425849 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.425930 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w88sw\" (UniqueName: \"kubernetes.io/projected/caaf60a5-8c45-4831-8d26-8cf808f1da7a-kube-api-access-w88sw\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.426015 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/caaf60a5-8c45-4831-8d26-8cf808f1da7a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.426108 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.429732 4795 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.429859 4795 
reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/caaf60a5-8c45-4831-8d26-8cf808f1da7a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.430122 4795 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/caaf60a5-8c45-4831-8d26-8cf808f1da7a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.445070 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.531612 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.792075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"caaf60a5-8c45-4831-8d26-8cf808f1da7a","Type":"ContainerDied","Data":"7bdfe7f881d951a74ca8b66b0f91841dff449bc239ef1c2b7c679ee61596377d"} Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.792125 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bdfe7f881d951a74ca8b66b0f91841dff449bc239ef1c2b7c679ee61596377d" Mar 20 18:26:55 crc kubenswrapper[4795]: I0320 18:26:55.792249 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.030021 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:27:00 crc kubenswrapper[4795]: E0320 18:27:00.030876 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caaf60a5-8c45-4831-8d26-8cf808f1da7a" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.030890 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="caaf60a5-8c45-4831-8d26-8cf808f1da7a" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:27:00 crc kubenswrapper[4795]: E0320 18:27:00.030912 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654cb8e4-7fd7-4e3e-955a-a71906ccfb79" containerName="oc" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.030919 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="654cb8e4-7fd7-4e3e-955a-a71906ccfb79" containerName="oc" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.031143 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="654cb8e4-7fd7-4e3e-955a-a71906ccfb79" containerName="oc" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.031174 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="caaf60a5-8c45-4831-8d26-8cf808f1da7a" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.031797 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.034346 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-zgwjr" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.052849 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.121184 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.121312 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54b6h\" (UniqueName: \"kubernetes.io/projected/3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0-kube-api-access-54b6h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.222968 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54b6h\" (UniqueName: \"kubernetes.io/projected/3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0-kube-api-access-54b6h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.223134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.223662 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.245408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54b6h\" (UniqueName: \"kubernetes.io/projected/3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0-kube-api-access-54b6h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.259576 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.350113 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.786018 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:27:00 crc kubenswrapper[4795]: I0320 18:27:00.837761 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0","Type":"ContainerStarted","Data":"02466a550805e88a2a340de106b3d33013005661fbbe664ff91ad333baba735b"} Mar 20 18:27:02 crc kubenswrapper[4795]: I0320 18:27:02.863750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0","Type":"ContainerStarted","Data":"e60aa9f2ea835d15178a6c165de56d214c6d87950df63a46486e65f878f732ad"} Mar 20 18:27:02 crc kubenswrapper[4795]: I0320 18:27:02.889853 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.216400545 podStartE2EDuration="2.889832532s" podCreationTimestamp="2026-03-20 18:27:00 +0000 UTC" firstStartedPulling="2026-03-20 18:27:00.799375512 +0000 UTC m=+4164.257407063" lastFinishedPulling="2026-03-20 18:27:02.472807509 +0000 UTC m=+4165.930839050" observedRunningTime="2026-03-20 18:27:02.881270046 +0000 UTC m=+4166.339301597" watchObservedRunningTime="2026-03-20 18:27:02.889832532 +0000 UTC m=+4166.347864083" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.831611 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n2wfg/must-gather-gb8cc"] Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.834200 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.835555 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n2wfg"/"openshift-service-ca.crt" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.835946 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n2wfg"/"kube-root-ca.crt" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.836057 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n2wfg"/"default-dockercfg-khqtb" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.845269 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n2wfg/must-gather-gb8cc"] Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.956861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq4p6\" (UniqueName: \"kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:27 crc kubenswrapper[4795]: I0320 18:27:27.956908 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.058911 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq4p6\" (UniqueName: \"kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " 
pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.058959 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.059486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.078579 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq4p6\" (UniqueName: \"kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6\") pod \"must-gather-gb8cc\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") " pod="openshift-must-gather-n2wfg/must-gather-gb8cc" Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.151591 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/must-gather-gb8cc"
Mar 20 18:27:28 crc kubenswrapper[4795]: I0320 18:27:28.605887 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n2wfg/must-gather-gb8cc"]
Mar 20 18:27:29 crc kubenswrapper[4795]: I0320 18:27:29.131059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" event={"ID":"a508da41-3cdb-4b99-b14e-a917c5153c72","Type":"ContainerStarted","Data":"6b5dd8edc9985b61820c61f74a3a75491202da230357733de6439e66a17b3693"}
Mar 20 18:27:34 crc kubenswrapper[4795]: I0320 18:27:34.181526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" event={"ID":"a508da41-3cdb-4b99-b14e-a917c5153c72","Type":"ContainerStarted","Data":"752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11"}
Mar 20 18:27:34 crc kubenswrapper[4795]: I0320 18:27:34.182187 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" event={"ID":"a508da41-3cdb-4b99-b14e-a917c5153c72","Type":"ContainerStarted","Data":"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"}
Mar 20 18:27:34 crc kubenswrapper[4795]: I0320 18:27:34.200184 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" podStartSLOduration=2.9078369840000002 podStartE2EDuration="7.200159292s" podCreationTimestamp="2026-03-20 18:27:27 +0000 UTC" firstStartedPulling="2026-03-20 18:27:28.873988848 +0000 UTC m=+4192.332020429" lastFinishedPulling="2026-03-20 18:27:33.166311206 +0000 UTC m=+4196.624342737" observedRunningTime="2026-03-20 18:27:34.197115027 +0000 UTC m=+4197.655146578" watchObservedRunningTime="2026-03-20 18:27:34.200159292 +0000 UTC m=+4197.658190873"
Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.742885 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-97766"]
Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.745084 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-97766"
Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.759998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766"
Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.760308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fk5b\" (UniqueName: \"kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766"
Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.861914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fk5b\" (UniqueName: \"kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766"
Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.862068 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766"
Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.862277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766"
Mar 20 18:27:37 crc kubenswrapper[4795]: I0320 18:27:37.888488 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fk5b\" (UniqueName: \"kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b\") pod \"crc-debug-97766\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") " pod="openshift-must-gather-n2wfg/crc-debug-97766"
Mar 20 18:27:38 crc kubenswrapper[4795]: I0320 18:27:38.066896 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-97766"
Mar 20 18:27:38 crc kubenswrapper[4795]: I0320 18:27:38.238521 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-97766" event={"ID":"632805f2-aba6-41af-89dd-8b176af4ab77","Type":"ContainerStarted","Data":"c6ea7c0685ab870ea1880247614497aae537421fa8bfa8af0cdffd21865b427c"}
Mar 20 18:27:41 crc kubenswrapper[4795]: I0320 18:27:41.300490 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 18:27:41 crc kubenswrapper[4795]: I0320 18:27:41.302655 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.291705 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bhjxn"]
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.294919 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.300205 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhjxn"]
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.411234 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.411581 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v66dt\" (UniqueName: \"kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.411656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.514024 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.514097 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v66dt\" (UniqueName: \"kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.514221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.514817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.514833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.536246 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v66dt\" (UniqueName: \"kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt\") pod \"certified-operators-bhjxn\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") " pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:45 crc kubenswrapper[4795]: I0320 18:27:45.627635 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:47 crc kubenswrapper[4795]: I0320 18:27:47.875589 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhjxn"]
Mar 20 18:27:48 crc kubenswrapper[4795]: I0320 18:27:48.340969 4795 generic.go:334] "Generic (PLEG): container finished" podID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerID="1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4" exitCode=0
Mar 20 18:27:48 crc kubenswrapper[4795]: I0320 18:27:48.341063 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerDied","Data":"1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4"}
Mar 20 18:27:48 crc kubenswrapper[4795]: I0320 18:27:48.341379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerStarted","Data":"4ccd8707afcd417a2b4fcfee27edadddc134eab632522b4d93bf84758023574e"}
Mar 20 18:27:48 crc kubenswrapper[4795]: I0320 18:27:48.345063 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-97766" event={"ID":"632805f2-aba6-41af-89dd-8b176af4ab77","Type":"ContainerStarted","Data":"ac8208908e414910c05525aac4a0b345cc75b5ac6d2db89cd45e2d5c13fcd4e8"}
Mar 20 18:27:48 crc kubenswrapper[4795]: I0320 18:27:48.385675 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n2wfg/crc-debug-97766" podStartSLOduration=2.033673474 podStartE2EDuration="11.385644383s" podCreationTimestamp="2026-03-20 18:27:37 +0000 UTC" firstStartedPulling="2026-03-20 18:27:38.123292811 +0000 UTC m=+4201.581324352" lastFinishedPulling="2026-03-20 18:27:47.47526372 +0000 UTC m=+4210.933295261" observedRunningTime="2026-03-20 18:27:48.380585967 +0000 UTC m=+4211.838617508" watchObservedRunningTime="2026-03-20 18:27:48.385644383 +0000 UTC m=+4211.843675924"
Mar 20 18:27:49 crc kubenswrapper[4795]: I0320 18:27:49.370874 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerStarted","Data":"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1"}
Mar 20 18:27:50 crc kubenswrapper[4795]: I0320 18:27:50.378361 4795 generic.go:334] "Generic (PLEG): container finished" podID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerID="9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1" exitCode=0
Mar 20 18:27:50 crc kubenswrapper[4795]: I0320 18:27:50.378523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerDied","Data":"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1"}
Mar 20 18:27:51 crc kubenswrapper[4795]: I0320 18:27:51.392517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerStarted","Data":"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f"}
Mar 20 18:27:51 crc kubenswrapper[4795]: I0320 18:27:51.423647 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bhjxn" podStartSLOduration=3.990576168 podStartE2EDuration="6.423625691s" podCreationTimestamp="2026-03-20 18:27:45 +0000 UTC" firstStartedPulling="2026-03-20 18:27:48.344392533 +0000 UTC m=+4211.802424074" lastFinishedPulling="2026-03-20 18:27:50.777442046 +0000 UTC m=+4214.235473597" observedRunningTime="2026-03-20 18:27:51.419538024 +0000 UTC m=+4214.877569585" watchObservedRunningTime="2026-03-20 18:27:51.423625691 +0000 UTC m=+4214.881657232"
Mar 20 18:27:55 crc kubenswrapper[4795]: I0320 18:27:55.628709 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:55 crc kubenswrapper[4795]: I0320 18:27:55.630191 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:55 crc kubenswrapper[4795]: I0320 18:27:55.675923 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:56 crc kubenswrapper[4795]: I0320 18:27:56.532376 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:56 crc kubenswrapper[4795]: I0320 18:27:56.589396 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhjxn"]
Mar 20 18:27:58 crc kubenswrapper[4795]: I0320 18:27:58.470206 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bhjxn" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="registry-server" containerID="cri-o://5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f" gracePeriod=2
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.393551 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.481121 4795 generic.go:334] "Generic (PLEG): container finished" podID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerID="5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f" exitCode=0
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.481160 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerDied","Data":"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f"}
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.481174 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhjxn"
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.481185 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhjxn" event={"ID":"c9c595cc-573a-4c5d-95f9-48d3e0289c6f","Type":"ContainerDied","Data":"4ccd8707afcd417a2b4fcfee27edadddc134eab632522b4d93bf84758023574e"}
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.481202 4795 scope.go:117] "RemoveContainer" containerID="5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f"
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.483759 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v66dt\" (UniqueName: \"kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt\") pod \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") "
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.483840 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities\") pod \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") "
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.483935 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content\") pod \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\" (UID: \"c9c595cc-573a-4c5d-95f9-48d3e0289c6f\") "
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.484607 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities" (OuterVolumeSpecName: "utilities") pod "c9c595cc-573a-4c5d-95f9-48d3e0289c6f" (UID: "c9c595cc-573a-4c5d-95f9-48d3e0289c6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.531431 4795 scope.go:117] "RemoveContainer" containerID="9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1"
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.560727 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9c595cc-573a-4c5d-95f9-48d3e0289c6f" (UID: "c9c595cc-573a-4c5d-95f9-48d3e0289c6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.565989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt" (OuterVolumeSpecName: "kube-api-access-v66dt") pod "c9c595cc-573a-4c5d-95f9-48d3e0289c6f" (UID: "c9c595cc-573a-4c5d-95f9-48d3e0289c6f"). InnerVolumeSpecName "kube-api-access-v66dt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.580820 4795 scope.go:117] "RemoveContainer" containerID="1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4"
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.586126 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.586157 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v66dt\" (UniqueName: \"kubernetes.io/projected/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-kube-api-access-v66dt\") on node \"crc\" DevicePath \"\""
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.586169 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9c595cc-573a-4c5d-95f9-48d3e0289c6f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.598287 4795 scope.go:117] "RemoveContainer" containerID="5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f"
Mar 20 18:27:59 crc kubenswrapper[4795]: E0320 18:27:59.600408 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f\": container with ID starting with 5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f not found: ID does not exist" containerID="5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f"
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.600505 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f"} err="failed to get container status \"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f\": rpc error: code = NotFound desc = could not find container \"5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f\": container with ID starting with 5224a12e3f58a53837b4c1f0d19f3cc0a1d427d21115738c47b4de6a520f876f not found: ID does not exist"
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.600565 4795 scope.go:117] "RemoveContainer" containerID="9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1"
Mar 20 18:27:59 crc kubenswrapper[4795]: E0320 18:27:59.601045 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1\": container with ID starting with 9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1 not found: ID does not exist" containerID="9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1"
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.601083 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1"} err="failed to get container status \"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1\": rpc error: code = NotFound desc = could not find container \"9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1\": container with ID starting with 9be0f16d55170f39e2a50188cc4f73866ef1b65b20d4ce808c392b7ce1811dc1 not found: ID does not exist"
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.601104 4795 scope.go:117] "RemoveContainer" containerID="1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4"
Mar 20 18:27:59 crc kubenswrapper[4795]: E0320 18:27:59.603534 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4\": container with ID starting with 1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4 not found: ID does not exist" containerID="1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4"
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.603573 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4"} err="failed to get container status \"1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4\": rpc error: code = NotFound desc = could not find container \"1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4\": container with ID starting with 1cb8de870eada2139ec986c343a83d9f309a412eb93e7581eaec97d306104df4 not found: ID does not exist"
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.817066 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhjxn"]
Mar 20 18:27:59 crc kubenswrapper[4795]: I0320 18:27:59.825702 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bhjxn"]
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.151627 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567188-97rr4"]
Mar 20 18:28:00 crc kubenswrapper[4795]: E0320 18:28:00.152364 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="extract-utilities"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.152382 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="extract-utilities"
Mar 20 18:28:00 crc kubenswrapper[4795]: E0320 18:28:00.152397 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="extract-content"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.152404 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="extract-content"
Mar 20 18:28:00 crc kubenswrapper[4795]: E0320 18:28:00.152413 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="registry-server"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.152420 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="registry-server"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.152593 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" containerName="registry-server"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.153196 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-97rr4"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.156428 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.156435 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.157483 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.165413 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-97rr4"]
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.301821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lct6v\" (UniqueName: \"kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v\") pod \"auto-csr-approver-29567188-97rr4\" (UID: \"551830bd-5613-42fb-b4ad-b1c6c6a0b09c\") " pod="openshift-infra/auto-csr-approver-29567188-97rr4"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.405575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lct6v\" (UniqueName: \"kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v\") pod \"auto-csr-approver-29567188-97rr4\" (UID: \"551830bd-5613-42fb-b4ad-b1c6c6a0b09c\") " pod="openshift-infra/auto-csr-approver-29567188-97rr4"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.448646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lct6v\" (UniqueName: \"kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v\") pod \"auto-csr-approver-29567188-97rr4\" (UID: \"551830bd-5613-42fb-b4ad-b1c6c6a0b09c\") " pod="openshift-infra/auto-csr-approver-29567188-97rr4"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.476522 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-97rr4"
Mar 20 18:28:00 crc kubenswrapper[4795]: I0320 18:28:00.938496 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-97rr4"]
Mar 20 18:28:01 crc kubenswrapper[4795]: I0320 18:28:01.262086 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c595cc-573a-4c5d-95f9-48d3e0289c6f" path="/var/lib/kubelet/pods/c9c595cc-573a-4c5d-95f9-48d3e0289c6f/volumes"
Mar 20 18:28:01 crc kubenswrapper[4795]: I0320 18:28:01.518137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-97rr4" event={"ID":"551830bd-5613-42fb-b4ad-b1c6c6a0b09c","Type":"ContainerStarted","Data":"68f9e7c7ac5591cc6f6011c1b081e3f974b165cad8cd260e55cb3eea35ddf0b9"}
Mar 20 18:28:03 crc kubenswrapper[4795]: I0320 18:28:03.535270 4795 generic.go:334] "Generic (PLEG): container finished" podID="551830bd-5613-42fb-b4ad-b1c6c6a0b09c" containerID="29253cb593d65e36df8393e9b2e7d2df325902972a57f35ad5e0d8767eaa777e" exitCode=0
Mar 20 18:28:03 crc kubenswrapper[4795]: I0320 18:28:03.535731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-97rr4" event={"ID":"551830bd-5613-42fb-b4ad-b1c6c6a0b09c","Type":"ContainerDied","Data":"29253cb593d65e36df8393e9b2e7d2df325902972a57f35ad5e0d8767eaa777e"}
Mar 20 18:28:05 crc kubenswrapper[4795]: I0320 18:28:05.802856 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-97rr4"
Mar 20 18:28:05 crc kubenswrapper[4795]: I0320 18:28:05.904662 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lct6v\" (UniqueName: \"kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v\") pod \"551830bd-5613-42fb-b4ad-b1c6c6a0b09c\" (UID: \"551830bd-5613-42fb-b4ad-b1c6c6a0b09c\") "
Mar 20 18:28:05 crc kubenswrapper[4795]: I0320 18:28:05.910971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v" (OuterVolumeSpecName: "kube-api-access-lct6v") pod "551830bd-5613-42fb-b4ad-b1c6c6a0b09c" (UID: "551830bd-5613-42fb-b4ad-b1c6c6a0b09c"). InnerVolumeSpecName "kube-api-access-lct6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.006977 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lct6v\" (UniqueName: \"kubernetes.io/projected/551830bd-5613-42fb-b4ad-b1c6c6a0b09c-kube-api-access-lct6v\") on node \"crc\" DevicePath \"\""
Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.566245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-97rr4" event={"ID":"551830bd-5613-42fb-b4ad-b1c6c6a0b09c","Type":"ContainerDied","Data":"68f9e7c7ac5591cc6f6011c1b081e3f974b165cad8cd260e55cb3eea35ddf0b9"}
Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.566286 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68f9e7c7ac5591cc6f6011c1b081e3f974b165cad8cd260e55cb3eea35ddf0b9"
Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.566362 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-97rr4"
Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.876865 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-5tldr"]
Mar 20 18:28:06 crc kubenswrapper[4795]: I0320 18:28:06.884670 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-5tldr"]
Mar 20 18:28:07 crc kubenswrapper[4795]: I0320 18:28:07.263734 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478a3729-f417-4458-b0c5-562ed9c72252" path="/var/lib/kubelet/pods/478a3729-f417-4458-b0c5-562ed9c72252/volumes"
Mar 20 18:28:11 crc kubenswrapper[4795]: I0320 18:28:11.299938 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 18:28:11 crc kubenswrapper[4795]: I0320 18:28:11.300494 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 18:28:11 crc kubenswrapper[4795]: I0320 18:28:11.899190 4795 scope.go:117] "RemoveContainer" containerID="693c3bcc8927765191605aa2e63974d17e30d9c34ae2ebaf265ed06522fb9c6b"
Mar 20 18:28:34 crc kubenswrapper[4795]: I0320 18:28:34.822223 4795 generic.go:334] "Generic (PLEG): container finished" podID="632805f2-aba6-41af-89dd-8b176af4ab77" containerID="ac8208908e414910c05525aac4a0b345cc75b5ac6d2db89cd45e2d5c13fcd4e8" exitCode=0
Mar 20 18:28:34 crc kubenswrapper[4795]: I0320 18:28:34.822302 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-97766" event={"ID":"632805f2-aba6-41af-89dd-8b176af4ab77","Type":"ContainerDied","Data":"ac8208908e414910c05525aac4a0b345cc75b5ac6d2db89cd45e2d5c13fcd4e8"}
Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.942285 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-97766"
Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.976044 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-97766"]
Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.986363 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-97766"]
Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.989266 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host\") pod \"632805f2-aba6-41af-89dd-8b176af4ab77\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") "
Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.989354 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fk5b\" (UniqueName: \"kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b\") pod \"632805f2-aba6-41af-89dd-8b176af4ab77\" (UID: \"632805f2-aba6-41af-89dd-8b176af4ab77\") "
Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.989417 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host" (OuterVolumeSpecName: "host") pod "632805f2-aba6-41af-89dd-8b176af4ab77" (UID: "632805f2-aba6-41af-89dd-8b176af4ab77"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.989851 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/632805f2-aba6-41af-89dd-8b176af4ab77-host\") on node \"crc\" DevicePath \"\""
Mar 20 18:28:35 crc kubenswrapper[4795]: I0320 18:28:35.995957 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b" (OuterVolumeSpecName: "kube-api-access-9fk5b") pod "632805f2-aba6-41af-89dd-8b176af4ab77" (UID: "632805f2-aba6-41af-89dd-8b176af4ab77"). InnerVolumeSpecName "kube-api-access-9fk5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:28:36 crc kubenswrapper[4795]: I0320 18:28:36.092114 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fk5b\" (UniqueName: \"kubernetes.io/projected/632805f2-aba6-41af-89dd-8b176af4ab77-kube-api-access-9fk5b\") on node \"crc\" DevicePath \"\""
Mar 20 18:28:36 crc kubenswrapper[4795]: I0320 18:28:36.842419 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6ea7c0685ab870ea1880247614497aae537421fa8bfa8af0cdffd21865b427c"
Mar 20 18:28:36 crc kubenswrapper[4795]: I0320 18:28:36.842464 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-97766"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.163629 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-zjcbs"]
Mar 20 18:28:37 crc kubenswrapper[4795]: E0320 18:28:37.164100 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632805f2-aba6-41af-89dd-8b176af4ab77" containerName="container-00"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.164114 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="632805f2-aba6-41af-89dd-8b176af4ab77" containerName="container-00"
Mar 20 18:28:37 crc kubenswrapper[4795]: E0320 18:28:37.164161 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551830bd-5613-42fb-b4ad-b1c6c6a0b09c" containerName="oc"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.164170 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="551830bd-5613-42fb-b4ad-b1c6c6a0b09c" containerName="oc"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.164349 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="632805f2-aba6-41af-89dd-8b176af4ab77" containerName="container-00"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.164364 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="551830bd-5613-42fb-b4ad-b1c6c6a0b09c" containerName="oc"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.164930 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.210585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.210744 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxr7b\" (UniqueName: \"kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.262879 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="632805f2-aba6-41af-89dd-8b176af4ab77" path="/var/lib/kubelet/pods/632805f2-aba6-41af-89dd-8b176af4ab77/volumes"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.312286 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxr7b\" (UniqueName: \"kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.312448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs"
Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.312877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.366117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxr7b\" (UniqueName: \"kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b\") pod \"crc-debug-zjcbs\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.483811 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.851354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" event={"ID":"6a07434f-3d3f-4d03-b2e8-007a9df2f23c","Type":"ContainerStarted","Data":"740fd31422fdefe167ff9449396e8b7542957aeb0fadac8ada69ea60e5fabaf3"} Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.851621 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" event={"ID":"6a07434f-3d3f-4d03-b2e8-007a9df2f23c","Type":"ContainerStarted","Data":"2afcdeda4ec6392500e172a85f0874653e75d770ea9801b8d4a0b73ca4a4256e"} Mar 20 18:28:37 crc kubenswrapper[4795]: I0320 18:28:37.865941 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" podStartSLOduration=0.865924309 podStartE2EDuration="865.924309ms" podCreationTimestamp="2026-03-20 18:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:28:37.863191754 +0000 UTC m=+4261.321223285" watchObservedRunningTime="2026-03-20 18:28:37.865924309 +0000 UTC 
m=+4261.323955850" Mar 20 18:28:38 crc kubenswrapper[4795]: I0320 18:28:38.859401 4795 generic.go:334] "Generic (PLEG): container finished" podID="6a07434f-3d3f-4d03-b2e8-007a9df2f23c" containerID="740fd31422fdefe167ff9449396e8b7542957aeb0fadac8ada69ea60e5fabaf3" exitCode=0 Mar 20 18:28:38 crc kubenswrapper[4795]: I0320 18:28:38.859445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" event={"ID":"6a07434f-3d3f-4d03-b2e8-007a9df2f23c","Type":"ContainerDied","Data":"740fd31422fdefe167ff9449396e8b7542957aeb0fadac8ada69ea60e5fabaf3"} Mar 20 18:28:39 crc kubenswrapper[4795]: I0320 18:28:39.990451 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.087172 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-zjcbs"] Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.098245 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-zjcbs"] Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.152335 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host\") pod \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.152449 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host" (OuterVolumeSpecName: "host") pod "6a07434f-3d3f-4d03-b2e8-007a9df2f23c" (UID: "6a07434f-3d3f-4d03-b2e8-007a9df2f23c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.152570 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxr7b\" (UniqueName: \"kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b\") pod \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\" (UID: \"6a07434f-3d3f-4d03-b2e8-007a9df2f23c\") " Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.153173 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.163942 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b" (OuterVolumeSpecName: "kube-api-access-jxr7b") pod "6a07434f-3d3f-4d03-b2e8-007a9df2f23c" (UID: "6a07434f-3d3f-4d03-b2e8-007a9df2f23c"). InnerVolumeSpecName "kube-api-access-jxr7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.254459 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxr7b\" (UniqueName: \"kubernetes.io/projected/6a07434f-3d3f-4d03-b2e8-007a9df2f23c-kube-api-access-jxr7b\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.891344 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2afcdeda4ec6392500e172a85f0874653e75d770ea9801b8d4a0b73ca4a4256e" Mar 20 18:28:40 crc kubenswrapper[4795]: I0320 18:28:40.891604 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-zjcbs" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.263727 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a07434f-3d3f-4d03-b2e8-007a9df2f23c" path="/var/lib/kubelet/pods/6a07434f-3d3f-4d03-b2e8-007a9df2f23c/volumes" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.299897 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.299981 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.300049 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.301190 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.301294 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" 
containerName="machine-config-daemon" containerID="cri-o://6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc" gracePeriod=600 Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.318951 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-jrdxp"] Mar 20 18:28:41 crc kubenswrapper[4795]: E0320 18:28:41.319459 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a07434f-3d3f-4d03-b2e8-007a9df2f23c" containerName="container-00" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.319544 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a07434f-3d3f-4d03-b2e8-007a9df2f23c" containerName="container-00" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.319831 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a07434f-3d3f-4d03-b2e8-007a9df2f23c" containerName="container-00" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.320766 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.509775 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg48g\" (UniqueName: \"kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.509855 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.611669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg48g\" (UniqueName: \"kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.612032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.612172 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc 
kubenswrapper[4795]: I0320 18:28:41.652433 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg48g\" (UniqueName: \"kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g\") pod \"crc-debug-jrdxp\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.902598 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc" exitCode=0 Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.902653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc"} Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.902701 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"} Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.902743 4795 scope.go:117] "RemoveContainer" containerID="7cfdec97b4363ea4e176376f63dd09cd023a264cb0066da44d868b0e79d798a8" Mar 20 18:28:41 crc kubenswrapper[4795]: I0320 18:28:41.947352 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:41 crc kubenswrapper[4795]: W0320 18:28:41.984398 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44ed4b12_46ff_46fd_b451_308fec6fda3d.slice/crio-f1bdff2f5ff3211261939aeba7b33aa9b346dad647a5c207f455199bb02aabe4 WatchSource:0}: Error finding container f1bdff2f5ff3211261939aeba7b33aa9b346dad647a5c207f455199bb02aabe4: Status 404 returned error can't find the container with id f1bdff2f5ff3211261939aeba7b33aa9b346dad647a5c207f455199bb02aabe4 Mar 20 18:28:42 crc kubenswrapper[4795]: I0320 18:28:42.911086 4795 generic.go:334] "Generic (PLEG): container finished" podID="44ed4b12-46ff-46fd-b451-308fec6fda3d" containerID="cd9a82179132e7ce391f9016c2a0e0e1a65591e62d7440ada8662bd6a235be65" exitCode=0 Mar 20 18:28:42 crc kubenswrapper[4795]: I0320 18:28:42.911156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" event={"ID":"44ed4b12-46ff-46fd-b451-308fec6fda3d","Type":"ContainerDied","Data":"cd9a82179132e7ce391f9016c2a0e0e1a65591e62d7440ada8662bd6a235be65"} Mar 20 18:28:42 crc kubenswrapper[4795]: I0320 18:28:42.911591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" event={"ID":"44ed4b12-46ff-46fd-b451-308fec6fda3d","Type":"ContainerStarted","Data":"f1bdff2f5ff3211261939aeba7b33aa9b346dad647a5c207f455199bb02aabe4"} Mar 20 18:28:42 crc kubenswrapper[4795]: I0320 18:28:42.974114 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-jrdxp"] Mar 20 18:28:42 crc kubenswrapper[4795]: I0320 18:28:42.985506 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2wfg/crc-debug-jrdxp"] Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.038997 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.155814 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host\") pod \"44ed4b12-46ff-46fd-b451-308fec6fda3d\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.155988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg48g\" (UniqueName: \"kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g\") pod \"44ed4b12-46ff-46fd-b451-308fec6fda3d\" (UID: \"44ed4b12-46ff-46fd-b451-308fec6fda3d\") " Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.156154 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host" (OuterVolumeSpecName: "host") pod "44ed4b12-46ff-46fd-b451-308fec6fda3d" (UID: "44ed4b12-46ff-46fd-b451-308fec6fda3d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.156563 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44ed4b12-46ff-46fd-b451-308fec6fda3d-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.165376 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g" (OuterVolumeSpecName: "kube-api-access-sg48g") pod "44ed4b12-46ff-46fd-b451-308fec6fda3d" (UID: "44ed4b12-46ff-46fd-b451-308fec6fda3d"). InnerVolumeSpecName "kube-api-access-sg48g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.259420 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg48g\" (UniqueName: \"kubernetes.io/projected/44ed4b12-46ff-46fd-b451-308fec6fda3d-kube-api-access-sg48g\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.956335 4795 scope.go:117] "RemoveContainer" containerID="cd9a82179132e7ce391f9016c2a0e0e1a65591e62d7440ada8662bd6a235be65" Mar 20 18:28:44 crc kubenswrapper[4795]: I0320 18:28:44.956532 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/crc-debug-jrdxp" Mar 20 18:28:45 crc kubenswrapper[4795]: I0320 18:28:45.267241 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44ed4b12-46ff-46fd-b451-308fec6fda3d" path="/var/lib/kubelet/pods/44ed4b12-46ff-46fd-b451-308fec6fda3d/volumes" Mar 20 18:29:12 crc kubenswrapper[4795]: I0320 18:29:12.634760 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84776bb8f8-wkk7m_6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97/barbican-api/0.log" Mar 20 18:29:12 crc kubenswrapper[4795]: I0320 18:29:12.722794 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84776bb8f8-wkk7m_6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97/barbican-api-log/0.log" Mar 20 18:29:12 crc kubenswrapper[4795]: I0320 18:29:12.883255 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76977cb5bb-84w8l_faa8c15c-b759-4db8-ac4d-28648a8cfde2/barbican-keystone-listener/0.log" Mar 20 18:29:12 crc kubenswrapper[4795]: I0320 18:29:12.900563 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76977cb5bb-84w8l_faa8c15c-b759-4db8-ac4d-28648a8cfde2/barbican-keystone-listener-log/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.006197 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-558cc4f6c9-d6wp7_aa9d179b-7e78-4a37-80aa-3f3f6e7cabea/barbican-worker/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.074540 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-558cc4f6c9-d6wp7_aa9d179b-7e78-4a37-80aa-3f3f6e7cabea/barbican-worker-log/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.348299 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/ceilometer-central-agent/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.446673 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/ceilometer-notification-agent/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.471445 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/proxy-httpd/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.524883 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-65dps_0708214e-e711-465a-a54e-97a462b2777e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.539113 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/sg-core/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.711650 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0b19426b-81a4-4498-9754-948e8b7154d9/cinder-api/0.log" Mar 20 18:29:13 crc kubenswrapper[4795]: I0320 18:29:13.737718 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0b19426b-81a4-4498-9754-948e8b7154d9/cinder-api-log/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 
18:29:14.033669 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8eeb13a5-ef36-44eb-9dfd-7798e9ad1620/cinder-scheduler/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.076401 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8eeb13a5-ef36-44eb-9dfd-7798e9ad1620/probe/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.391119 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hprm9_2bad20c9-d77a-4c30-8fa2-979c05697cf4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.581511 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/init/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.713777 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm_3d666090-1065-4b2d-9ac6-b84776b53d0a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.790759 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/init/0.log" Mar 20 18:29:14 crc kubenswrapper[4795]: I0320 18:29:14.970874 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/dnsmasq-dns/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.100323 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k_b0af5324-4ba3-4a12-9fdb-b467918ba19d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.110799 4795 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_glance-default-external-api-0_264c2db4-1919-41ce-aea3-bd777167a9ca/glance-httpd/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.197531 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_264c2db4-1919-41ce-aea3-bd777167a9ca/glance-log/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.351438 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_81d40eb0-c26d-46e7-b8be-631de2f502b9/glance-httpd/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.412607 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_81d40eb0-c26d-46e7-b8be-631de2f502b9/glance-log/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.659879 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7fb74ddb8-dbrvh_f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109/horizon/0.log" Mar 20 18:29:15 crc kubenswrapper[4795]: I0320 18:29:15.814290 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5_0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:16 crc kubenswrapper[4795]: I0320 18:29:16.023004 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7fb74ddb8-dbrvh_f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109/horizon-log/0.log" Mar 20 18:29:16 crc kubenswrapper[4795]: I0320 18:29:16.378564 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567161-t26vc_cdfe5ffc-ab15-4277-966f-f506e725e8b1/keystone-cron/0.log" Mar 20 18:29:16 crc kubenswrapper[4795]: I0320 18:29:16.659727 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_72605c7d-99df-450f-900b-3022b0520149/kube-state-metrics/0.log" Mar 20 18:29:16 crc kubenswrapper[4795]: I0320 
18:29:16.881706 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5rj55_20b330a0-830c-419e-81fe-a36dd1a32cc2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:16 crc kubenswrapper[4795]: I0320 18:29:16.983854 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85b996ff68-fdzxg_7b20a034-11f6-40ad-9447-32c49f705c07/keystone-api/0.log" Mar 20 18:29:17 crc kubenswrapper[4795]: I0320 18:29:17.713177 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-649db44647-mrjns_5a472785-4467-4c97-93b9-e6f6eff19126/neutron-httpd/0.log" Mar 20 18:29:17 crc kubenswrapper[4795]: I0320 18:29:17.837281 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-649db44647-mrjns_5a472785-4467-4c97-93b9-e6f6eff19126/neutron-api/0.log" Mar 20 18:29:17 crc kubenswrapper[4795]: I0320 18:29:17.937068 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7_e29f4857-ff0d-4806-ba09-74448200e8e2/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.151385 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rp77q_b6da9d2a-e18f-4994-b8f3-6b1eb969564b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.235449 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_480a6609-0395-4bda-9ec8-a3ebf30931a7/nova-api-log/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.670423 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5916e4d2-2863-4088-be97-cf368906820b/nova-cell0-conductor-conductor/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.715615 4795 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_nova-api-0_480a6609-0395-4bda-9ec8-a3ebf30931a7/nova-api-api/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.727084 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_19c15c93-572c-4d53-b924-172f3ad29c8a/nova-cell1-conductor-conductor/0.log" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.795451 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:19 crc kubenswrapper[4795]: E0320 18:29:19.795954 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ed4b12-46ff-46fd-b451-308fec6fda3d" containerName="container-00" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.795975 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ed4b12-46ff-46fd-b451-308fec6fda3d" containerName="container-00" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.796513 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ed4b12-46ff-46fd-b451-308fec6fda3d" containerName="container-00" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.797946 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.849944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.850054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.850113 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzv7\" (UniqueName: \"kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.869722 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.951262 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.951328 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.951372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzv7\" (UniqueName: \"kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.952041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:19 crc kubenswrapper[4795]: I0320 18:29:19.952057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content\") pod \"redhat-marketplace-27vc2\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:20 crc kubenswrapper[4795]: I0320 18:29:20.014501 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d2a5e398-6d25-43b1-8c29-407af2d9348b/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 18:29:20 crc kubenswrapper[4795]: I0320 18:29:20.507093 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzv7\" (UniqueName: \"kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7\") pod \"redhat-marketplace-27vc2\" (UID: 
\"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:20 crc kubenswrapper[4795]: I0320 18:29:20.765406 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:20 crc kubenswrapper[4795]: I0320 18:29:20.911483 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ff4af01b-01b5-4154-8591-7ec99e3d6ef0/nova-metadata-log/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.225800 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.317240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerStarted","Data":"f48d14fe770b6f14c5b8978e7e896e405febd9356d726a808ae98af698bf5a4d"} Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.444931 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/mysql-bootstrap/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.527893 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ff4af01b-01b5-4154-8591-7ec99e3d6ef0/nova-metadata-metadata/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.691650 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/mysql-bootstrap/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.698141 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c23f56ff-eceb-4891-87e5-57ebeb7eba8d/nova-scheduler-scheduler/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.735829 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/galera/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.924947 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/mysql-bootstrap/0.log" Mar 20 18:29:21 crc kubenswrapper[4795]: I0320 18:29:21.999170 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-kc4wx_709f5080-c511-4d3b-bc9c-baeec85fa245/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.080234 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/mysql-bootstrap/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.163369 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/galera/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.213922 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_cf3f8aea-393e-418a-ad14-2848c8df93e9/openstackclient/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.333276 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerID="5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1" exitCode=0 Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.333318 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerDied","Data":"5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1"} Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.337346 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:29:22 crc 
kubenswrapper[4795]: I0320 18:29:22.411613 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dnp2g_28df10bb-d6a9-47a9-9b79-0bb9665529ef/ovn-controller/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.420781 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-n4gzx_85004117-20bc-474e-88f5-ce49032749ff/openstack-network-exporter/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.660177 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server-init/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.804315 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server-init/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.852023 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovs-vswitchd/0.log" Mar 20 18:29:22 crc kubenswrapper[4795]: I0320 18:29:22.853566 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.056892 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6cfc9397-7268-4bd1-8bbf-d107e94ab35a/openstack-network-exporter/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.164923 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6cfc9397-7268-4bd1-8bbf-d107e94ab35a/ovn-northd/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.320220 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b9d4ac2-2b66-441a-a6d4-0d467d857f99/openstack-network-exporter/0.log" Mar 20 18:29:23 crc 
kubenswrapper[4795]: I0320 18:29:23.366467 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9jw45_6c737290-0616-475b-a839-cca387d8d90d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.375530 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b9d4ac2-2b66-441a-a6d4-0d467d857f99/ovsdbserver-nb/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.557409 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c07f346e-3e6c-41a5-bdda-67a4a5f04ba7/openstack-network-exporter/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.560966 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c07f346e-3e6c-41a5-bdda-67a4a5f04ba7/ovsdbserver-sb/0.log" Mar 20 18:29:23 crc kubenswrapper[4795]: I0320 18:29:23.903126 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/setup-container/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.053602 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-fc784f9bb-wjct6_48841a5b-142c-49d0-8e87-8562f8d1f824/placement-api/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.086103 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-fc784f9bb-wjct6_48841a5b-142c-49d0-8e87-8562f8d1f824/placement-log/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.089613 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/setup-container/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.210028 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/rabbitmq/0.log" Mar 
20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.326892 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/setup-container/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.353586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerStarted","Data":"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20"} Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.483966 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/setup-container/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.537744 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/rabbitmq/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.583293 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88_1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.804598 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-tx6d9_d7dc5d37-6d24-48ea-acc1-2b4ed3de6936/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:24 crc kubenswrapper[4795]: I0320 18:29:24.845832 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk_e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.087231 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j6rls_80cf5a83-936d-4789-a7bc-b91cdb0e564d/ssh-known-hosts-edpm-deployment/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.112726 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-prfq6_9cdb4943-60a1-41cc-aead-1702a4c1f68a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.140166 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ca95ec62-fce9-4c91-bb59-fa80f512edba/memcached/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.302540 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6697f55ff5-fj55x_e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6/proxy-server/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.360164 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6697f55ff5-fj55x_e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6/proxy-httpd/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.364125 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerID="8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20" exitCode=0 Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.364174 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerDied","Data":"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20"} Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.373438 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-m8zw5_2c422574-0103-4c97-9e23-5a78c5b44e69/swift-ring-rebalance/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.529733 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-auditor/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.571803 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-reaper/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.609394 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-server/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.639335 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-replicator/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.698898 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-auditor/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.749168 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-replicator/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.783059 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-server/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.797243 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-updater/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.856829 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-auditor/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.941483 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-expirer/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.960810 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-replicator/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.975374 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-server/0.log" Mar 20 18:29:25 crc kubenswrapper[4795]: I0320 18:29:25.986617 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-updater/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.074138 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/rsync/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.115304 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/swift-recon-cron/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.299613 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_caaf60a5-8c45-4831-8d26-8cf808f1da7a/tempest-tests-tempest-tests-runner/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.372940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerStarted","Data":"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000"} Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.397711 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-27vc2" podStartSLOduration=3.742151994 podStartE2EDuration="7.397677217s" 
podCreationTimestamp="2026-03-20 18:29:19 +0000 UTC" firstStartedPulling="2026-03-20 18:29:22.337121263 +0000 UTC m=+4305.795152804" lastFinishedPulling="2026-03-20 18:29:25.992646486 +0000 UTC m=+4309.450678027" observedRunningTime="2026-03-20 18:29:26.389836754 +0000 UTC m=+4309.847868295" watchObservedRunningTime="2026-03-20 18:29:26.397677217 +0000 UTC m=+4309.855708758" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.462136 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0/test-operator-logs-container/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.617375 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5_35b4aa82-d668-474b-b54d-b540190f5a6c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:26 crc kubenswrapper[4795]: I0320 18:29:26.864168 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh_d519d04c-89f1-46b7-8136-1a9596af73ac/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:29:30 crc kubenswrapper[4795]: I0320 18:29:30.766127 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:30 crc kubenswrapper[4795]: I0320 18:29:30.766642 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:30 crc kubenswrapper[4795]: I0320 18:29:30.814997 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:31 crc kubenswrapper[4795]: I0320 18:29:31.515325 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:31 crc 
kubenswrapper[4795]: I0320 18:29:31.570368 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:33 crc kubenswrapper[4795]: I0320 18:29:33.448460 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-27vc2" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="registry-server" containerID="cri-o://57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000" gracePeriod=2 Mar 20 18:29:33 crc kubenswrapper[4795]: I0320 18:29:33.943453 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.035357 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content\") pod \"5d33b616-99f6-473a-8114-0203d0f7e9fb\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.035413 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzzv7\" (UniqueName: \"kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7\") pod \"5d33b616-99f6-473a-8114-0203d0f7e9fb\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.035436 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities\") pod \"5d33b616-99f6-473a-8114-0203d0f7e9fb\" (UID: \"5d33b616-99f6-473a-8114-0203d0f7e9fb\") " Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.036603 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities" (OuterVolumeSpecName: "utilities") pod "5d33b616-99f6-473a-8114-0203d0f7e9fb" (UID: "5d33b616-99f6-473a-8114-0203d0f7e9fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.041108 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7" (OuterVolumeSpecName: "kube-api-access-gzzv7") pod "5d33b616-99f6-473a-8114-0203d0f7e9fb" (UID: "5d33b616-99f6-473a-8114-0203d0f7e9fb"). InnerVolumeSpecName "kube-api-access-gzzv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.063548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d33b616-99f6-473a-8114-0203d0f7e9fb" (UID: "5d33b616-99f6-473a-8114-0203d0f7e9fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.137673 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.137731 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzzv7\" (UniqueName: \"kubernetes.io/projected/5d33b616-99f6-473a-8114-0203d0f7e9fb-kube-api-access-gzzv7\") on node \"crc\" DevicePath \"\"" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.137744 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d33b616-99f6-473a-8114-0203d0f7e9fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.459783 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerID="57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000" exitCode=0 Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.459834 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27vc2" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.459853 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerDied","Data":"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000"} Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.461071 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27vc2" event={"ID":"5d33b616-99f6-473a-8114-0203d0f7e9fb","Type":"ContainerDied","Data":"f48d14fe770b6f14c5b8978e7e896e405febd9356d726a808ae98af698bf5a4d"} Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.461101 4795 scope.go:117] "RemoveContainer" containerID="57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.481489 4795 scope.go:117] "RemoveContainer" containerID="8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.500815 4795 scope.go:117] "RemoveContainer" containerID="5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.505011 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.513532 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-27vc2"] Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.551415 4795 scope.go:117] "RemoveContainer" containerID="57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000" Mar 20 18:29:34 crc kubenswrapper[4795]: E0320 18:29:34.551945 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000\": container with ID starting with 57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000 not found: ID does not exist" containerID="57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.551989 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000"} err="failed to get container status \"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000\": rpc error: code = NotFound desc = could not find container \"57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000\": container with ID starting with 57b4ba98ccc54bdf917d9eb3f6bede537917d27efa54409d254a7c85858fe000 not found: ID does not exist" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.552017 4795 scope.go:117] "RemoveContainer" containerID="8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20" Mar 20 18:29:34 crc kubenswrapper[4795]: E0320 18:29:34.552459 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20\": container with ID starting with 8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20 not found: ID does not exist" containerID="8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.552488 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20"} err="failed to get container status \"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20\": rpc error: code = NotFound desc = could not find container \"8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20\": container with ID 
starting with 8fb699d64f879a30f9e666d8f35089f75becb101b9a0c7a63b5fc3567ef19e20 not found: ID does not exist" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.552508 4795 scope.go:117] "RemoveContainer" containerID="5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1" Mar 20 18:29:34 crc kubenswrapper[4795]: E0320 18:29:34.552754 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1\": container with ID starting with 5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1 not found: ID does not exist" containerID="5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1" Mar 20 18:29:34 crc kubenswrapper[4795]: I0320 18:29:34.552793 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1"} err="failed to get container status \"5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1\": rpc error: code = NotFound desc = could not find container \"5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1\": container with ID starting with 5f01bf111be333e2fea04bfbfd481e28755dcb75d42fd5a33bda10a29748b5f1 not found: ID does not exist" Mar 20 18:29:35 crc kubenswrapper[4795]: I0320 18:29:35.261265 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" path="/var/lib/kubelet/pods/5d33b616-99f6-473a-8114-0203d0f7e9fb/volumes" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.351945 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.534145 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.545764 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.591137 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.764663 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.767177 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/extract/0.log" Mar 20 18:29:52 crc kubenswrapper[4795]: I0320 18:29:52.773317 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:29:53 crc kubenswrapper[4795]: I0320 18:29:53.064141 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-5hzvs_afefdb79-bad6-4deb-904b-515174cca414/manager/0.log" Mar 20 18:29:53 crc kubenswrapper[4795]: I0320 18:29:53.190544 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-jgs27_43804d6b-2358-46fd-bf04-26b2308f8ab0/manager/0.log" Mar 20 18:29:53 crc 
kubenswrapper[4795]: I0320 18:29:53.366886 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-dwx6n_a957ef3d-357c-4aa4-865c-533f889257d7/manager/0.log" Mar 20 18:29:53 crc kubenswrapper[4795]: I0320 18:29:53.527287 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-rmcrf_4cdd16c5-b7d3-4c52-a286-f3555daf43d9/manager/0.log" Mar 20 18:29:53 crc kubenswrapper[4795]: I0320 18:29:53.678987 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-f74p9_ded84ba8-d70a-4379-bc80-d142e5306cc7/manager/0.log" Mar 20 18:29:53 crc kubenswrapper[4795]: I0320 18:29:53.993744 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-55vp5_9cba9cd3-4144-4262-82a2-f2330793aae6/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.259659 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-6hsxn_84901a7b-ddbf-47d9-954f-c167cd9cd46c/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.403733 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6f8b7f6fdf-lrjfh_fc0f2e63-50dd-424e-af01-3d09c9edd5b3/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.482086 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-trjt4_7a887d91-fa86-45d2-a6be-aa7326f7d544/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.662486 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-jfdzb_071f0af8-4164-4f95-b0ee-720e3b3097f3/manager/0.log" Mar 20 
18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.802213 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-bqzcz_0ffe016b-8919-4b8f-839c-669637b7accc/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.821545 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-h9f9t_21481bba-04ec-47ce-95d0-fe27787a3d62/manager/0.log" Mar 20 18:29:54 crc kubenswrapper[4795]: I0320 18:29:54.958958 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-5v5sg_0da03e08-561c-4b5f-89c7-af80c8f39f54/manager/0.log" Mar 20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.068913 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-n7cl7_d4ff6977-1303-4267-983e-3e99935f2aae/manager/0.log" Mar 20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.158433 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f557zsq_a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2/manager/0.log" Mar 20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.348401 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-65b67cc5c9-vm29j_084071f5-e58b-451b-9cf5-67203ae1ba02/operator/0.log" Mar 20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.583128 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-b6ckg_3aeffd27-d2c7-4744-8e01-07a4db74597e/registry-server/0.log" Mar 20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.779277 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-dtfmz_84a19583-b173-4fb9-8b83-d9c41a5faf79/manager/0.log" Mar 
20 18:29:55 crc kubenswrapper[4795]: I0320 18:29:55.914780 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-6cw7v_b47e6216-2e29-4d58-8b0c-5970aee6307b/manager/0.log" Mar 20 18:29:56 crc kubenswrapper[4795]: I0320 18:29:56.140011 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-828jr_750d9405-0514-4876-821e-9ab1f6871e87/manager/0.log" Mar 20 18:29:56 crc kubenswrapper[4795]: I0320 18:29:56.258031 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-jbwss_46248665-6f9f-46e0-8db7-6be8c47cf521/manager/0.log" Mar 20 18:29:56 crc kubenswrapper[4795]: I0320 18:29:56.396448 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-rv5df_e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38/manager/0.log" Mar 20 18:29:56 crc kubenswrapper[4795]: I0320 18:29:56.533470 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-6z7j5_933bcfd5-f2d1-404f-876d-1d3da597f415/manager/0.log" Mar 20 18:29:56 crc kubenswrapper[4795]: I0320 18:29:56.630768 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-56f44579c8-px2ft_0d8b26db-957e-4c0e-bb22-42f12d5beb0b/manager/0.log" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.147229 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567190-fmpq8"] Mar 20 18:30:00 crc kubenswrapper[4795]: E0320 18:30:00.148211 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="extract-utilities" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.148230 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="extract-utilities" Mar 20 18:30:00 crc kubenswrapper[4795]: E0320 18:30:00.148264 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="registry-server" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.148272 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="registry-server" Mar 20 18:30:00 crc kubenswrapper[4795]: E0320 18:30:00.148292 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="extract-content" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.148299 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="extract-content" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.148517 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d33b616-99f6-473a-8114-0203d0f7e9fb" containerName="registry-server" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.149318 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.151236 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.151406 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.151461 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.154894 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g"] Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.155969 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.158362 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.158564 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.171755 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-fmpq8"] Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.205682 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g"] Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.253398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.254214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.254252 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs6fw\" (UniqueName: \"kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw\") pod \"auto-csr-approver-29567190-fmpq8\" (UID: \"a7e72f01-1ab6-47a2-99d2-ff2778039c34\") " pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.254296 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gwtm\" (UniqueName: \"kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.356366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 
18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.356412 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs6fw\" (UniqueName: \"kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw\") pod \"auto-csr-approver-29567190-fmpq8\" (UID: \"a7e72f01-1ab6-47a2-99d2-ff2778039c34\") " pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.356457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gwtm\" (UniqueName: \"kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.356609 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.357374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.366471 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume\") pod \"collect-profiles-29567190-k669g\" (UID: 
\"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.967276 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs6fw\" (UniqueName: \"kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw\") pod \"auto-csr-approver-29567190-fmpq8\" (UID: \"a7e72f01-1ab6-47a2-99d2-ff2778039c34\") " pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:00 crc kubenswrapper[4795]: I0320 18:30:00.967587 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gwtm\" (UniqueName: \"kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm\") pod \"collect-profiles-29567190-k669g\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.073315 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.107805 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.629194 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-fmpq8"] Mar 20 18:30:01 crc kubenswrapper[4795]: W0320 18:30:01.635299 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4a6542a_c4e9_4747_89bd_b15f37e98854.slice/crio-e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64 WatchSource:0}: Error finding container e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64: Status 404 returned error can't find the container with id e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64 Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.639338 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g"] Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.722091 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" event={"ID":"f4a6542a-c4e9-4747-89bd-b15f37e98854","Type":"ContainerStarted","Data":"e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64"} Mar 20 18:30:01 crc kubenswrapper[4795]: I0320 18:30:01.722918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" event={"ID":"a7e72f01-1ab6-47a2-99d2-ff2778039c34","Type":"ContainerStarted","Data":"da4b300832cbaf79cd4467333ae6ea3634f2faeac48948a563d0a49de3b8854a"} Mar 20 18:30:02 crc kubenswrapper[4795]: I0320 18:30:02.731352 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4a6542a-c4e9-4747-89bd-b15f37e98854" containerID="c6eef357aa6aa565680007e12db67df115ab8783e62776b7bb3c464aafd1537a" exitCode=0 Mar 20 18:30:02 crc kubenswrapper[4795]: I0320 18:30:02.731404 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" event={"ID":"f4a6542a-c4e9-4747-89bd-b15f37e98854","Type":"ContainerDied","Data":"c6eef357aa6aa565680007e12db67df115ab8783e62776b7bb3c464aafd1537a"} Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.076537 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.134947 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume\") pod \"f4a6542a-c4e9-4747-89bd-b15f37e98854\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.135501 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume\") pod \"f4a6542a-c4e9-4747-89bd-b15f37e98854\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.135581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gwtm\" (UniqueName: \"kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm\") pod \"f4a6542a-c4e9-4747-89bd-b15f37e98854\" (UID: \"f4a6542a-c4e9-4747-89bd-b15f37e98854\") " Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.136223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume" (OuterVolumeSpecName: "config-volume") pod "f4a6542a-c4e9-4747-89bd-b15f37e98854" (UID: "f4a6542a-c4e9-4747-89bd-b15f37e98854"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.143313 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f4a6542a-c4e9-4747-89bd-b15f37e98854" (UID: "f4a6542a-c4e9-4747-89bd-b15f37e98854"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.143785 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm" (OuterVolumeSpecName: "kube-api-access-4gwtm") pod "f4a6542a-c4e9-4747-89bd-b15f37e98854" (UID: "f4a6542a-c4e9-4747-89bd-b15f37e98854"). InnerVolumeSpecName "kube-api-access-4gwtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.237362 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4a6542a-c4e9-4747-89bd-b15f37e98854-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.237392 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gwtm\" (UniqueName: \"kubernetes.io/projected/f4a6542a-c4e9-4747-89bd-b15f37e98854-kube-api-access-4gwtm\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.237401 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4a6542a-c4e9-4747-89bd-b15f37e98854-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.755784 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" 
event={"ID":"f4a6542a-c4e9-4747-89bd-b15f37e98854","Type":"ContainerDied","Data":"e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64"} Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.755813 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-k669g" Mar 20 18:30:04 crc kubenswrapper[4795]: I0320 18:30:04.755825 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5520210b9b99c61c45c3d36fc550a02e6df4ef7eca77ff977c259d6ba93bd64" Mar 20 18:30:05 crc kubenswrapper[4795]: I0320 18:30:05.152415 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"] Mar 20 18:30:05 crc kubenswrapper[4795]: I0320 18:30:05.167928 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-fpxpb"] Mar 20 18:30:05 crc kubenswrapper[4795]: I0320 18:30:05.262114 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd60241d-b207-4a9a-86b6-3be32ab282d3" path="/var/lib/kubelet/pods/cd60241d-b207-4a9a-86b6-3be32ab282d3/volumes" Mar 20 18:30:05 crc kubenswrapper[4795]: I0320 18:30:05.773943 4795 generic.go:334] "Generic (PLEG): container finished" podID="a7e72f01-1ab6-47a2-99d2-ff2778039c34" containerID="e6a437e3ef5671482fc87ddf7b0443a4a6151e38d08d0d94800ebfc859f95be2" exitCode=0 Mar 20 18:30:05 crc kubenswrapper[4795]: I0320 18:30:05.774002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" event={"ID":"a7e72f01-1ab6-47a2-99d2-ff2778039c34","Type":"ContainerDied","Data":"e6a437e3ef5671482fc87ddf7b0443a4a6151e38d08d0d94800ebfc859f95be2"} Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.128501 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.288803 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs6fw\" (UniqueName: \"kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw\") pod \"a7e72f01-1ab6-47a2-99d2-ff2778039c34\" (UID: \"a7e72f01-1ab6-47a2-99d2-ff2778039c34\") " Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.300458 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw" (OuterVolumeSpecName: "kube-api-access-hs6fw") pod "a7e72f01-1ab6-47a2-99d2-ff2778039c34" (UID: "a7e72f01-1ab6-47a2-99d2-ff2778039c34"). InnerVolumeSpecName "kube-api-access-hs6fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.391432 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs6fw\" (UniqueName: \"kubernetes.io/projected/a7e72f01-1ab6-47a2-99d2-ff2778039c34-kube-api-access-hs6fw\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.794195 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" event={"ID":"a7e72f01-1ab6-47a2-99d2-ff2778039c34","Type":"ContainerDied","Data":"da4b300832cbaf79cd4467333ae6ea3634f2faeac48948a563d0a49de3b8854a"} Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.794238 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da4b300832cbaf79cd4467333ae6ea3634f2faeac48948a563d0a49de3b8854a" Mar 20 18:30:07 crc kubenswrapper[4795]: I0320 18:30:07.794251 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-fmpq8" Mar 20 18:30:08 crc kubenswrapper[4795]: I0320 18:30:08.190502 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-gzns5"] Mar 20 18:30:08 crc kubenswrapper[4795]: I0320 18:30:08.201951 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-gzns5"] Mar 20 18:30:09 crc kubenswrapper[4795]: I0320 18:30:09.266553 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="133e7bff-461c-4450-bf3b-8d43791045a4" path="/var/lib/kubelet/pods/133e7bff-461c-4450-bf3b-8d43791045a4/volumes" Mar 20 18:30:12 crc kubenswrapper[4795]: I0320 18:30:12.039312 4795 scope.go:117] "RemoveContainer" containerID="ccece14cbf4b4c8c9889d9ebad0a41bd1c87c88349be10c251ccd8a08eb4cac4" Mar 20 18:30:12 crc kubenswrapper[4795]: I0320 18:30:12.069215 4795 scope.go:117] "RemoveContainer" containerID="7e78ac608afa56e8111695b336413ee802aca06929422f0042e8a413df5d1f4a" Mar 20 18:30:17 crc kubenswrapper[4795]: I0320 18:30:17.147080 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-knn77_cd9b8a97-1b9d-4365-a985-a02d4078e3c2/control-plane-machine-set-operator/0.log" Mar 20 18:30:17 crc kubenswrapper[4795]: I0320 18:30:17.330086 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-p5hmr_9f31b9ac-9447-4b20-ac60-7532edfa4600/kube-rbac-proxy/0.log" Mar 20 18:30:17 crc kubenswrapper[4795]: I0320 18:30:17.382288 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-p5hmr_9f31b9ac-9447-4b20-ac60-7532edfa4600/machine-api-operator/0.log" Mar 20 18:30:32 crc kubenswrapper[4795]: I0320 18:30:32.115182 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-858654f9db-lqmsr_5231a25a-8bda-4f72-8a81-e5a49cdc31eb/cert-manager-controller/0.log" Mar 20 18:30:32 crc kubenswrapper[4795]: I0320 18:30:32.218926 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-smr2n_7df834a3-0298-4cc9-8b4e-49ce3f51183e/cert-manager-cainjector/0.log" Mar 20 18:30:32 crc kubenswrapper[4795]: I0320 18:30:32.323304 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cff8c_88832f68-9f72-4321-8d3f-bb3e23465fdb/cert-manager-webhook/0.log" Mar 20 18:30:41 crc kubenswrapper[4795]: I0320 18:30:41.299949 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:30:41 crc kubenswrapper[4795]: I0320 18:30:41.300604 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:30:46 crc kubenswrapper[4795]: I0320 18:30:46.906286 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-5jfjl_d34761db-41bf-4e5f-bdca-8c25e281c924/nmstate-console-plugin/0.log" Mar 20 18:30:47 crc kubenswrapper[4795]: I0320 18:30:47.141062 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bsp49_e070281f-65f5-4c6d-b012-06c027393646/nmstate-handler/0.log" Mar 20 18:30:47 crc kubenswrapper[4795]: I0320 18:30:47.163825 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-xjj2s_65c42497-77ba-49bc-a292-5003a353fde6/kube-rbac-proxy/0.log" Mar 20 18:30:47 crc kubenswrapper[4795]: I0320 18:30:47.193606 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-xjj2s_65c42497-77ba-49bc-a292-5003a353fde6/nmstate-metrics/0.log" Mar 20 18:30:47 crc kubenswrapper[4795]: I0320 18:30:47.392720 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-mjhsq_f50011ef-d180-4d84-ba10-a2da522a579d/nmstate-webhook/0.log" Mar 20 18:30:47 crc kubenswrapper[4795]: I0320 18:30:47.452588 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-dlcps_efca4120-31ef-4c52-a6da-59b33144a979/nmstate-operator/0.log" Mar 20 18:31:11 crc kubenswrapper[4795]: I0320 18:31:11.299839 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:31:11 crc kubenswrapper[4795]: I0320 18:31:11.300657 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.461336 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kvtc5_2ce06e1f-5454-4b85-888b-3230c0086c2e/kube-rbac-proxy/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.502625 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kvtc5_2ce06e1f-5454-4b85-888b-3230c0086c2e/controller/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.545174 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.672386 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.699370 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.699445 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.748236 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:31:19 crc kubenswrapper[4795]: I0320 18:31:19.980738 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.010173 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.010535 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.033985 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.762060 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.774328 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/controller/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.779469 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.787832 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.957150 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/kube-rbac-proxy-frr/0.log" Mar 20 18:31:20 crc kubenswrapper[4795]: I0320 18:31:20.980317 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/frr-metrics/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.015845 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/kube-rbac-proxy/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.179836 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/reloader/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.240299 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jw5dk_377dbbb7-0571-40cd-9fe3-3c86fbf4f092/frr-k8s-webhook-server/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.503176 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7547f4d8c8-499mj_0e8dba8d-8387-4ced-ac54-b8d5e1cf3650/manager/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.755388 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5ffc48dc7-t9vwn_2d29ac93-da31-4834-a858-d5bd9adb28d1/webhook-server/0.log" Mar 20 18:31:21 crc kubenswrapper[4795]: I0320 18:31:21.761461 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bl9qp_8834c8fc-36f7-41da-867f-ec5a32e25b36/kube-rbac-proxy/0.log" Mar 20 18:31:22 crc kubenswrapper[4795]: I0320 18:31:22.415537 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bl9qp_8834c8fc-36f7-41da-867f-ec5a32e25b36/speaker/0.log" Mar 20 18:31:22 crc kubenswrapper[4795]: I0320 18:31:22.523395 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/frr/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.426768 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.661478 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.704790 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.718543 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.900336 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.914863 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/extract/0.log" Mar 20 18:31:36 crc kubenswrapper[4795]: I0320 18:31:36.943950 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.078584 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.260287 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.262586 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 
18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.267292 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.444413 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.449528 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.468276 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/extract/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.592424 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.803044 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.811495 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.837853 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 
18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.980099 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:31:37 crc kubenswrapper[4795]: I0320 18:31:37.999570 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.214907 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.411898 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/registry-server/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.428805 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.437274 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.455945 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.619867 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.620049 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.928867 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8x76m_a2de2777-57e1-4310-a878-1cfc1fc77e44/marketplace-operator/0.log" Mar 20 18:31:38 crc kubenswrapper[4795]: I0320 18:31:38.950496 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:31:39 crc kubenswrapper[4795]: I0320 18:31:39.135323 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/registry-server/0.log" Mar 20 18:31:39 crc kubenswrapper[4795]: I0320 18:31:39.186733 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:31:39 crc kubenswrapper[4795]: I0320 18:31:39.861417 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:31:39 crc kubenswrapper[4795]: I0320 18:31:39.891378 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.052788 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.059331 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.230849 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/registry-server/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.288550 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.425485 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.457856 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.458972 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.618416 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:31:40 crc kubenswrapper[4795]: I0320 18:31:40.645981 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.125308 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/registry-server/0.log" Mar 20 
18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.299959 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.300018 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.300069 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.301043 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.301120 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" gracePeriod=600 Mar 20 18:31:41 crc kubenswrapper[4795]: E0320 18:31:41.531379 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.688010 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" exitCode=0 Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.688070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"} Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.688117 4795 scope.go:117] "RemoveContainer" containerID="6a5d9213747be81871bb974d3fd3e36f0b57d64c41b528ec627ea4072bdbd9dc" Mar 20 18:31:41 crc kubenswrapper[4795]: I0320 18:31:41.688867 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:31:41 crc kubenswrapper[4795]: E0320 18:31:41.689163 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:31:53 crc kubenswrapper[4795]: I0320 18:31:53.254968 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:31:53 crc kubenswrapper[4795]: E0320 18:31:53.256123 4795 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.142512 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567192-4p2jk"] Mar 20 18:32:00 crc kubenswrapper[4795]: E0320 18:32:00.143343 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e72f01-1ab6-47a2-99d2-ff2778039c34" containerName="oc" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.143356 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e72f01-1ab6-47a2-99d2-ff2778039c34" containerName="oc" Mar 20 18:32:00 crc kubenswrapper[4795]: E0320 18:32:00.143393 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a6542a-c4e9-4747-89bd-b15f37e98854" containerName="collect-profiles" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.143400 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a6542a-c4e9-4747-89bd-b15f37e98854" containerName="collect-profiles" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.143572 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e72f01-1ab6-47a2-99d2-ff2778039c34" containerName="oc" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.143582 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a6542a-c4e9-4747-89bd-b15f37e98854" containerName="collect-profiles" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.144157 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-4p2jk" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.151753 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-4p2jk"] Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.152581 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.152646 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.152821 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.270118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h\") pod \"auto-csr-approver-29567192-4p2jk\" (UID: \"6cec1e8e-999e-44e2-a9b5-387a10c5de11\") " pod="openshift-infra/auto-csr-approver-29567192-4p2jk" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.371940 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h\") pod \"auto-csr-approver-29567192-4p2jk\" (UID: \"6cec1e8e-999e-44e2-a9b5-387a10c5de11\") " pod="openshift-infra/auto-csr-approver-29567192-4p2jk" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.411429 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h\") pod \"auto-csr-approver-29567192-4p2jk\" (UID: \"6cec1e8e-999e-44e2-a9b5-387a10c5de11\") " 
pod="openshift-infra/auto-csr-approver-29567192-4p2jk" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.466878 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-4p2jk" Mar 20 18:32:00 crc kubenswrapper[4795]: I0320 18:32:00.977249 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-4p2jk"] Mar 20 18:32:01 crc kubenswrapper[4795]: I0320 18:32:01.877063 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567192-4p2jk" event={"ID":"6cec1e8e-999e-44e2-a9b5-387a10c5de11","Type":"ContainerStarted","Data":"d03146044402c32a68ed0339fdaeb2a74f4f65f66e1f308a5a0fa66c04fc2f6f"} Mar 20 18:32:02 crc kubenswrapper[4795]: I0320 18:32:02.888497 4795 generic.go:334] "Generic (PLEG): container finished" podID="6cec1e8e-999e-44e2-a9b5-387a10c5de11" containerID="aef631dab3c0134ed5918f9313f1d6153c072f4d69296d8ee64df68955ab56a0" exitCode=0 Mar 20 18:32:02 crc kubenswrapper[4795]: I0320 18:32:02.888630 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567192-4p2jk" event={"ID":"6cec1e8e-999e-44e2-a9b5-387a10c5de11","Type":"ContainerDied","Data":"aef631dab3c0134ed5918f9313f1d6153c072f4d69296d8ee64df68955ab56a0"} Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.276094 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-4p2jk" Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.449774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h\") pod \"6cec1e8e-999e-44e2-a9b5-387a10c5de11\" (UID: \"6cec1e8e-999e-44e2-a9b5-387a10c5de11\") " Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.457666 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h" (OuterVolumeSpecName: "kube-api-access-j2x4h") pod "6cec1e8e-999e-44e2-a9b5-387a10c5de11" (UID: "6cec1e8e-999e-44e2-a9b5-387a10c5de11"). InnerVolumeSpecName "kube-api-access-j2x4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.551628 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/6cec1e8e-999e-44e2-a9b5-387a10c5de11-kube-api-access-j2x4h\") on node \"crc\" DevicePath \"\"" Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.907913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567192-4p2jk" event={"ID":"6cec1e8e-999e-44e2-a9b5-387a10c5de11","Type":"ContainerDied","Data":"d03146044402c32a68ed0339fdaeb2a74f4f65f66e1f308a5a0fa66c04fc2f6f"} Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.907977 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567192-4p2jk" Mar 20 18:32:04 crc kubenswrapper[4795]: I0320 18:32:04.908041 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d03146044402c32a68ed0339fdaeb2a74f4f65f66e1f308a5a0fa66c04fc2f6f" Mar 20 18:32:05 crc kubenswrapper[4795]: I0320 18:32:05.341606 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-l88zk"] Mar 20 18:32:05 crc kubenswrapper[4795]: I0320 18:32:05.348420 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-l88zk"] Mar 20 18:32:07 crc kubenswrapper[4795]: I0320 18:32:07.259386 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:32:07 crc kubenswrapper[4795]: E0320 18:32:07.261174 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:32:07 crc kubenswrapper[4795]: I0320 18:32:07.262278 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="654cb8e4-7fd7-4e3e-955a-a71906ccfb79" path="/var/lib/kubelet/pods/654cb8e4-7fd7-4e3e-955a-a71906ccfb79/volumes" Mar 20 18:32:12 crc kubenswrapper[4795]: I0320 18:32:12.183375 4795 scope.go:117] "RemoveContainer" containerID="43b386b54b2c5ae34c509074586a617552173f2543677683ef8d11caf140f2f9" Mar 20 18:32:20 crc kubenswrapper[4795]: I0320 18:32:20.252538 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:32:20 crc kubenswrapper[4795]: E0320 18:32:20.253426 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:32:32 crc kubenswrapper[4795]: I0320 18:32:32.252470 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:32:32 crc kubenswrapper[4795]: E0320 18:32:32.253638 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:32:43 crc kubenswrapper[4795]: I0320 18:32:43.253587 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:32:43 crc kubenswrapper[4795]: E0320 18:32:43.254369 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:32:54 crc kubenswrapper[4795]: I0320 18:32:54.252721 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:32:54 crc kubenswrapper[4795]: E0320 
18:32:54.253632 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:33:06 crc kubenswrapper[4795]: I0320 18:33:06.252728 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:33:06 crc kubenswrapper[4795]: E0320 18:33:06.253953 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:33:17 crc kubenswrapper[4795]: I0320 18:33:17.263739 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:33:17 crc kubenswrapper[4795]: E0320 18:33:17.264956 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:33:30 crc kubenswrapper[4795]: I0320 18:33:30.252157 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:33:30 crc 
kubenswrapper[4795]: E0320 18:33:30.253426 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:33:44 crc kubenswrapper[4795]: I0320 18:33:44.253128 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:33:44 crc kubenswrapper[4795]: E0320 18:33:44.254362 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:33:58 crc kubenswrapper[4795]: I0320 18:33:58.252662 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:33:58 crc kubenswrapper[4795]: E0320 18:33:58.254142 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:33:59 crc kubenswrapper[4795]: I0320 18:33:59.145793 4795 generic.go:334] "Generic (PLEG): container finished" podID="a508da41-3cdb-4b99-b14e-a917c5153c72" 
containerID="fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553" exitCode=0
Mar 20 18:33:59 crc kubenswrapper[4795]: I0320 18:33:59.145837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" event={"ID":"a508da41-3cdb-4b99-b14e-a917c5153c72","Type":"ContainerDied","Data":"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"}
Mar 20 18:33:59 crc kubenswrapper[4795]: I0320 18:33:59.146422 4795 scope.go:117] "RemoveContainer" containerID="fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"
Mar 20 18:33:59 crc kubenswrapper[4795]: I0320 18:33:59.264781 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n2wfg_must-gather-gb8cc_a508da41-3cdb-4b99-b14e-a917c5153c72/gather/0.log"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.152088 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567194-t7ml6"]
Mar 20 18:34:00 crc kubenswrapper[4795]: E0320 18:34:00.153019 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cec1e8e-999e-44e2-a9b5-387a10c5de11" containerName="oc"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.153049 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cec1e8e-999e-44e2-a9b5-387a10c5de11" containerName="oc"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.153350 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cec1e8e-999e-44e2-a9b5-387a10c5de11" containerName="oc"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.154132 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.156290 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.156578 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.159790 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.161309 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-t7ml6"]
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.197909 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljfk2\" (UniqueName: \"kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2\") pod \"auto-csr-approver-29567194-t7ml6\" (UID: \"7ce7186d-a505-4b16-ae93-2d95886d5f2d\") " pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.299980 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljfk2\" (UniqueName: \"kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2\") pod \"auto-csr-approver-29567194-t7ml6\" (UID: \"7ce7186d-a505-4b16-ae93-2d95886d5f2d\") " pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.318270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljfk2\" (UniqueName: \"kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2\") pod \"auto-csr-approver-29567194-t7ml6\" (UID: \"7ce7186d-a505-4b16-ae93-2d95886d5f2d\") " pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.477882 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:00 crc kubenswrapper[4795]: I0320 18:34:00.938992 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-t7ml6"]
Mar 20 18:34:01 crc kubenswrapper[4795]: I0320 18:34:01.165954 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567194-t7ml6" event={"ID":"7ce7186d-a505-4b16-ae93-2d95886d5f2d","Type":"ContainerStarted","Data":"04c81d063f980e1aa073cef625e10b89bbeec4092afbf2754d0a2463270ef2a3"}
Mar 20 18:34:04 crc kubenswrapper[4795]: I0320 18:34:04.192337 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ce7186d-a505-4b16-ae93-2d95886d5f2d" containerID="640523aab60a0eb7070cfd04917b68ca823bf4802ffb825df958dcc7af70501e" exitCode=0
Mar 20 18:34:04 crc kubenswrapper[4795]: I0320 18:34:04.192444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567194-t7ml6" event={"ID":"7ce7186d-a505-4b16-ae93-2d95886d5f2d","Type":"ContainerDied","Data":"640523aab60a0eb7070cfd04917b68ca823bf4802ffb825df958dcc7af70501e"}
Mar 20 18:34:05 crc kubenswrapper[4795]: I0320 18:34:05.622061 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:05 crc kubenswrapper[4795]: I0320 18:34:05.701738 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljfk2\" (UniqueName: \"kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2\") pod \"7ce7186d-a505-4b16-ae93-2d95886d5f2d\" (UID: \"7ce7186d-a505-4b16-ae93-2d95886d5f2d\") "
Mar 20 18:34:05 crc kubenswrapper[4795]: I0320 18:34:05.710904 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2" (OuterVolumeSpecName: "kube-api-access-ljfk2") pod "7ce7186d-a505-4b16-ae93-2d95886d5f2d" (UID: "7ce7186d-a505-4b16-ae93-2d95886d5f2d"). InnerVolumeSpecName "kube-api-access-ljfk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:34:05 crc kubenswrapper[4795]: I0320 18:34:05.804494 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljfk2\" (UniqueName: \"kubernetes.io/projected/7ce7186d-a505-4b16-ae93-2d95886d5f2d-kube-api-access-ljfk2\") on node \"crc\" DevicePath \"\""
Mar 20 18:34:06 crc kubenswrapper[4795]: I0320 18:34:06.215826 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567194-t7ml6" event={"ID":"7ce7186d-a505-4b16-ae93-2d95886d5f2d","Type":"ContainerDied","Data":"04c81d063f980e1aa073cef625e10b89bbeec4092afbf2754d0a2463270ef2a3"}
Mar 20 18:34:06 crc kubenswrapper[4795]: I0320 18:34:06.215864 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c81d063f980e1aa073cef625e10b89bbeec4092afbf2754d0a2463270ef2a3"
Mar 20 18:34:06 crc kubenswrapper[4795]: I0320 18:34:06.215874 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567194-t7ml6"
Mar 20 18:34:06 crc kubenswrapper[4795]: I0320 18:34:06.749991 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-97rr4"]
Mar 20 18:34:06 crc kubenswrapper[4795]: I0320 18:34:06.763657 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-97rr4"]
Mar 20 18:34:07 crc kubenswrapper[4795]: I0320 18:34:07.267855 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="551830bd-5613-42fb-b4ad-b1c6c6a0b09c" path="/var/lib/kubelet/pods/551830bd-5613-42fb-b4ad-b1c6c6a0b09c/volumes"
Mar 20 18:34:08 crc kubenswrapper[4795]: I0320 18:34:08.331837 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n2wfg/must-gather-gb8cc"]
Mar 20 18:34:08 crc kubenswrapper[4795]: I0320 18:34:08.332628 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-n2wfg/must-gather-gb8cc" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="copy" containerID="cri-o://752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11" gracePeriod=2
Mar 20 18:34:08 crc kubenswrapper[4795]: I0320 18:34:08.340517 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n2wfg/must-gather-gb8cc"]
Mar 20 18:34:08 crc kubenswrapper[4795]: I0320 18:34:08.933546 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n2wfg_must-gather-gb8cc_a508da41-3cdb-4b99-b14e-a917c5153c72/copy/0.log"
Mar 20 18:34:08 crc kubenswrapper[4795]: I0320 18:34:08.934438 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/must-gather-gb8cc"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.067342 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq4p6\" (UniqueName: \"kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6\") pod \"a508da41-3cdb-4b99-b14e-a917c5153c72\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") "
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.067625 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output\") pod \"a508da41-3cdb-4b99-b14e-a917c5153c72\" (UID: \"a508da41-3cdb-4b99-b14e-a917c5153c72\") "
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.082264 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6" (OuterVolumeSpecName: "kube-api-access-wq4p6") pod "a508da41-3cdb-4b99-b14e-a917c5153c72" (UID: "a508da41-3cdb-4b99-b14e-a917c5153c72"). InnerVolumeSpecName "kube-api-access-wq4p6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.169733 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq4p6\" (UniqueName: \"kubernetes.io/projected/a508da41-3cdb-4b99-b14e-a917c5153c72-kube-api-access-wq4p6\") on node \"crc\" DevicePath \"\""
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.237619 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a508da41-3cdb-4b99-b14e-a917c5153c72" (UID: "a508da41-3cdb-4b99-b14e-a917c5153c72"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.250763 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n2wfg_must-gather-gb8cc_a508da41-3cdb-4b99-b14e-a917c5153c72/copy/0.log"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.251150 4795 generic.go:334] "Generic (PLEG): container finished" podID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerID="752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11" exitCode=143
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.251298 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n2wfg/must-gather-gb8cc"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.261756 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" path="/var/lib/kubelet/pods/a508da41-3cdb-4b99-b14e-a917c5153c72/volumes"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.262389 4795 scope.go:117] "RemoveContainer" containerID="752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.273937 4795 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a508da41-3cdb-4b99-b14e-a917c5153c72-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.285324 4795 scope.go:117] "RemoveContainer" containerID="fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.324469 4795 scope.go:117] "RemoveContainer" containerID="752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11"
Mar 20 18:34:09 crc kubenswrapper[4795]: E0320 18:34:09.324899 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11\": container with ID starting with 752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11 not found: ID does not exist" containerID="752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.324947 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11"} err="failed to get container status \"752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11\": rpc error: code = NotFound desc = could not find container \"752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11\": container with ID starting with 752bfa1d44a6386a9d8d1a4a4fee5855b9b7801a13b61fe657c231e42d2cec11 not found: ID does not exist"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.325027 4795 scope.go:117] "RemoveContainer" containerID="fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"
Mar 20 18:34:09 crc kubenswrapper[4795]: E0320 18:34:09.325342 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553\": container with ID starting with fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553 not found: ID does not exist" containerID="fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"
Mar 20 18:34:09 crc kubenswrapper[4795]: I0320 18:34:09.325389 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553"} err="failed to get container status \"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553\": rpc error: code = NotFound desc = could not find container \"fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553\": container with ID starting with fd43dcc7fb9218b1e9ba307dc698b1b5c2ed168d2cf100b1219a8238e36f7553 not found: ID does not exist"
Mar 20 18:34:12 crc kubenswrapper[4795]: I0320 18:34:12.299936 4795 scope.go:117] "RemoveContainer" containerID="ac8208908e414910c05525aac4a0b345cc75b5ac6d2db89cd45e2d5c13fcd4e8"
Mar 20 18:34:12 crc kubenswrapper[4795]: I0320 18:34:12.322681 4795 scope.go:117] "RemoveContainer" containerID="29253cb593d65e36df8393e9b2e7d2df325902972a57f35ad5e0d8767eaa777e"
Mar 20 18:34:13 crc kubenswrapper[4795]: I0320 18:34:13.253328 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:34:13 crc kubenswrapper[4795]: E0320 18:34:13.255305 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:34:26 crc kubenswrapper[4795]: I0320 18:34:26.252216 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:34:26 crc kubenswrapper[4795]: E0320 18:34:26.253362 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:34:39 crc kubenswrapper[4795]: I0320 18:34:39.252898 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:34:39 crc kubenswrapper[4795]: E0320 18:34:39.253552 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:34:51 crc kubenswrapper[4795]: I0320 18:34:51.253288 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:34:51 crc kubenswrapper[4795]: E0320 18:34:51.254719 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:35:03 crc kubenswrapper[4795]: I0320 18:35:03.252768 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:35:03 crc kubenswrapper[4795]: E0320 18:35:03.253703 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:35:12 crc kubenswrapper[4795]: I0320 18:35:12.442123 4795 scope.go:117] "RemoveContainer" containerID="740fd31422fdefe167ff9449396e8b7542957aeb0fadac8ada69ea60e5fabaf3"
Mar 20 18:35:15 crc kubenswrapper[4795]: I0320 18:35:15.257471 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:35:15 crc kubenswrapper[4795]: E0320 18:35:15.258418 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.747032 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"]
Mar 20 18:35:19 crc kubenswrapper[4795]: E0320 18:35:19.748201 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce7186d-a505-4b16-ae93-2d95886d5f2d" containerName="oc"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748222 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce7186d-a505-4b16-ae93-2d95886d5f2d" containerName="oc"
Mar 20 18:35:19 crc kubenswrapper[4795]: E0320 18:35:19.748265 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="gather"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748279 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="gather"
Mar 20 18:35:19 crc kubenswrapper[4795]: E0320 18:35:19.748320 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="copy"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748333 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="copy"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748651 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="copy"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748723 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce7186d-a505-4b16-ae93-2d95886d5f2d" containerName="oc"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.748755 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a508da41-3cdb-4b99-b14e-a917c5153c72" containerName="gather"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.751535 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.759355 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"]
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.861110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.861225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpvbv\" (UniqueName: \"kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.861367 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.963400 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.963489 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.963566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpvbv\" (UniqueName: \"kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.963975 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.964095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:19 crc kubenswrapper[4795]: I0320 18:35:19.985780 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpvbv\" (UniqueName: \"kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv\") pod \"redhat-operators-sk5fl\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:20 crc kubenswrapper[4795]: I0320 18:35:20.082863 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:20 crc kubenswrapper[4795]: I0320 18:35:20.560567 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"]
Mar 20 18:35:21 crc kubenswrapper[4795]: I0320 18:35:21.005532 4795 generic.go:334] "Generic (PLEG): container finished" podID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerID="bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242" exitCode=0
Mar 20 18:35:21 crc kubenswrapper[4795]: I0320 18:35:21.005635 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerDied","Data":"bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242"}
Mar 20 18:35:21 crc kubenswrapper[4795]: I0320 18:35:21.005917 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerStarted","Data":"ef650a02c257831f7b67cbc0f959703c18d4a6a48e121217bcbe9ea59e4d7d58"}
Mar 20 18:35:21 crc kubenswrapper[4795]: I0320 18:35:21.007929 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.039765 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerStarted","Data":"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081"}
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.516673 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x5256"]
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.518742 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.530423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjt5t\" (UniqueName: \"kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.530616 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.530661 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.548033 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x5256"]
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.633075 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.633129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.633279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjt5t\" (UniqueName: \"kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.633585 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.633882 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.659592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjt5t\" (UniqueName: \"kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t\") pod \"community-operators-x5256\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:23 crc kubenswrapper[4795]: I0320 18:35:23.845419 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:24 crc kubenswrapper[4795]: I0320 18:35:24.049596 4795 generic.go:334] "Generic (PLEG): container finished" podID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerID="6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081" exitCode=0
Mar 20 18:35:24 crc kubenswrapper[4795]: I0320 18:35:24.049995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerDied","Data":"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081"}
Mar 20 18:35:24 crc kubenswrapper[4795]: I0320 18:35:24.844562 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x5256"]
Mar 20 18:35:24 crc kubenswrapper[4795]: W0320 18:35:24.848392 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb25b0297_a790_4365_aa94_c551db2f983d.slice/crio-a6466de712257e7c53fe2f0a704fae36469bf2f5d079ca120c58e0d49592295e WatchSource:0}: Error finding container a6466de712257e7c53fe2f0a704fae36469bf2f5d079ca120c58e0d49592295e: Status 404 returned error can't find the container with id a6466de712257e7c53fe2f0a704fae36469bf2f5d079ca120c58e0d49592295e
Mar 20 18:35:25 crc kubenswrapper[4795]: I0320 18:35:25.059981 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerStarted","Data":"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7"}
Mar 20 18:35:25 crc kubenswrapper[4795]: I0320 18:35:25.061486 4795 generic.go:334] "Generic (PLEG): container finished" podID="b25b0297-a790-4365-aa94-c551db2f983d" containerID="ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c" exitCode=0
Mar 20 18:35:25 crc kubenswrapper[4795]: I0320 18:35:25.061512 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerDied","Data":"ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c"}
Mar 20 18:35:25 crc kubenswrapper[4795]: I0320 18:35:25.061530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerStarted","Data":"a6466de712257e7c53fe2f0a704fae36469bf2f5d079ca120c58e0d49592295e"}
Mar 20 18:35:25 crc kubenswrapper[4795]: I0320 18:35:25.083652 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sk5fl" podStartSLOduration=2.589637596 podStartE2EDuration="6.083632332s" podCreationTimestamp="2026-03-20 18:35:19 +0000 UTC" firstStartedPulling="2026-03-20 18:35:21.007650689 +0000 UTC m=+4664.465682230" lastFinishedPulling="2026-03-20 18:35:24.501645425 +0000 UTC m=+4667.959676966" observedRunningTime="2026-03-20 18:35:25.076528411 +0000 UTC m=+4668.534559952" watchObservedRunningTime="2026-03-20 18:35:25.083632332 +0000 UTC m=+4668.541663873"
Mar 20 18:35:27 crc kubenswrapper[4795]: I0320 18:35:27.082714 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerStarted","Data":"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e"}
Mar 20 18:35:28 crc kubenswrapper[4795]: I0320 18:35:28.097228 4795 generic.go:334] "Generic (PLEG): container finished" podID="b25b0297-a790-4365-aa94-c551db2f983d" containerID="316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e" exitCode=0
Mar 20 18:35:28 crc kubenswrapper[4795]: I0320 18:35:28.097302 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerDied","Data":"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e"}
Mar 20 18:35:28 crc kubenswrapper[4795]: I0320 18:35:28.252410 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761"
Mar 20 18:35:28 crc kubenswrapper[4795]: E0320 18:35:28.252991 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:35:29 crc kubenswrapper[4795]: I0320 18:35:29.106233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerStarted","Data":"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28"}
Mar 20 18:35:29 crc kubenswrapper[4795]: I0320 18:35:29.125991 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x5256" podStartSLOduration=2.673062489 podStartE2EDuration="6.125976022s" podCreationTimestamp="2026-03-20 18:35:23 +0000 UTC" firstStartedPulling="2026-03-20 18:35:25.062846557 +0000 UTC m=+4668.520878098" lastFinishedPulling="2026-03-20 18:35:28.51576009 +0000 UTC m=+4671.973791631" observedRunningTime="2026-03-20 18:35:29.123432573 +0000 UTC m=+4672.581464114" watchObservedRunningTime="2026-03-20 18:35:29.125976022 +0000 UTC m=+4672.584007563"
Mar 20 18:35:30 crc kubenswrapper[4795]: I0320 18:35:30.083958 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:30 crc kubenswrapper[4795]: I0320 18:35:30.084272 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sk5fl"
Mar 20 18:35:31 crc kubenswrapper[4795]: I0320 18:35:31.133675 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sk5fl" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="registry-server" probeResult="failure" output=<
Mar 20 18:35:31 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Mar 20 18:35:31 crc kubenswrapper[4795]: >
Mar 20 18:35:33 crc kubenswrapper[4795]: I0320 18:35:33.846031 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:33 crc kubenswrapper[4795]: I0320 18:35:33.846892 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:33 crc kubenswrapper[4795]: I0320 18:35:33.935365 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:34 crc kubenswrapper[4795]: I0320 18:35:34.205516 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:34 crc kubenswrapper[4795]: I0320 18:35:34.278806 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x5256"]
Mar 20 18:35:36 crc kubenswrapper[4795]: I0320 18:35:36.174087 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x5256" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="registry-server" containerID="cri-o://18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28" gracePeriod=2
Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.183361 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5256"
Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.185443 4795 generic.go:334] "Generic (PLEG): container finished" podID="b25b0297-a790-4365-aa94-c551db2f983d" containerID="18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28" exitCode=0
Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.185484 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerDied","Data":"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28"}
Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.185515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5256" event={"ID":"b25b0297-a790-4365-aa94-c551db2f983d","Type":"ContainerDied","Data":"a6466de712257e7c53fe2f0a704fae36469bf2f5d079ca120c58e0d49592295e"}
Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.185531 4795 scope.go:117] "RemoveContainer" containerID="18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28"
Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.219513 4795 scope.go:117] "RemoveContainer"
containerID="316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.236783 4795 scope.go:117] "RemoveContainer" containerID="ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.285103 4795 scope.go:117] "RemoveContainer" containerID="18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28" Mar 20 18:35:37 crc kubenswrapper[4795]: E0320 18:35:37.285505 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28\": container with ID starting with 18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28 not found: ID does not exist" containerID="18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.285537 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28"} err="failed to get container status \"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28\": rpc error: code = NotFound desc = could not find container \"18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28\": container with ID starting with 18a49318bc0d50993c30415d86c36f66afa4baaa7821b46939a852ce180b9a28 not found: ID does not exist" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.285556 4795 scope.go:117] "RemoveContainer" containerID="316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e" Mar 20 18:35:37 crc kubenswrapper[4795]: E0320 18:35:37.285802 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e\": container with ID starting with 
316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e not found: ID does not exist" containerID="316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.285824 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e"} err="failed to get container status \"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e\": rpc error: code = NotFound desc = could not find container \"316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e\": container with ID starting with 316ec7a8370891b59b49a88bf685cd0f039c1a58c0cee0d23a45b310cb7bdf0e not found: ID does not exist" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.285838 4795 scope.go:117] "RemoveContainer" containerID="ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c" Mar 20 18:35:37 crc kubenswrapper[4795]: E0320 18:35:37.286396 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c\": container with ID starting with ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c not found: ID does not exist" containerID="ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.286493 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c"} err="failed to get container status \"ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c\": rpc error: code = NotFound desc = could not find container \"ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c\": container with ID starting with ca755ea3584e9836b90a4b3a604244483628aba4d25c4396503d523e5eba3f4c not found: ID does not 
exist" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.301853 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjt5t\" (UniqueName: \"kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t\") pod \"b25b0297-a790-4365-aa94-c551db2f983d\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.302047 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities\") pod \"b25b0297-a790-4365-aa94-c551db2f983d\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.302160 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content\") pod \"b25b0297-a790-4365-aa94-c551db2f983d\" (UID: \"b25b0297-a790-4365-aa94-c551db2f983d\") " Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.309330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t" (OuterVolumeSpecName: "kube-api-access-rjt5t") pod "b25b0297-a790-4365-aa94-c551db2f983d" (UID: "b25b0297-a790-4365-aa94-c551db2f983d"). InnerVolumeSpecName "kube-api-access-rjt5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.315767 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities" (OuterVolumeSpecName: "utilities") pod "b25b0297-a790-4365-aa94-c551db2f983d" (UID: "b25b0297-a790-4365-aa94-c551db2f983d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.361558 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b25b0297-a790-4365-aa94-c551db2f983d" (UID: "b25b0297-a790-4365-aa94-c551db2f983d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.403861 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.403908 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjt5t\" (UniqueName: \"kubernetes.io/projected/b25b0297-a790-4365-aa94-c551db2f983d-kube-api-access-rjt5t\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:37 crc kubenswrapper[4795]: I0320 18:35:37.403919 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b25b0297-a790-4365-aa94-c551db2f983d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:38 crc kubenswrapper[4795]: I0320 18:35:38.197874 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x5256" Mar 20 18:35:38 crc kubenswrapper[4795]: I0320 18:35:38.242709 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x5256"] Mar 20 18:35:38 crc kubenswrapper[4795]: I0320 18:35:38.258140 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x5256"] Mar 20 18:35:39 crc kubenswrapper[4795]: I0320 18:35:39.268805 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b25b0297-a790-4365-aa94-c551db2f983d" path="/var/lib/kubelet/pods/b25b0297-a790-4365-aa94-c551db2f983d/volumes" Mar 20 18:35:40 crc kubenswrapper[4795]: I0320 18:35:40.140214 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sk5fl" Mar 20 18:35:40 crc kubenswrapper[4795]: I0320 18:35:40.201639 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sk5fl" Mar 20 18:35:40 crc kubenswrapper[4795]: I0320 18:35:40.252987 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:35:40 crc kubenswrapper[4795]: E0320 18:35:40.253586 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:35:40 crc kubenswrapper[4795]: I0320 18:35:40.590871 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"] Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.268049 4795 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-operators-sk5fl" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="registry-server" containerID="cri-o://929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7" gracePeriod=2 Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.809955 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sk5fl" Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.885112 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content\") pod \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.886901 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpvbv\" (UniqueName: \"kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv\") pod \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.886976 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities\") pod \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\" (UID: \"31c30b03-2316-4ccf-bb7d-379da1b6ba23\") " Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.888339 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities" (OuterVolumeSpecName: "utilities") pod "31c30b03-2316-4ccf-bb7d-379da1b6ba23" (UID: "31c30b03-2316-4ccf-bb7d-379da1b6ba23"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.895495 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv" (OuterVolumeSpecName: "kube-api-access-xpvbv") pod "31c30b03-2316-4ccf-bb7d-379da1b6ba23" (UID: "31c30b03-2316-4ccf-bb7d-379da1b6ba23"). InnerVolumeSpecName "kube-api-access-xpvbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.989797 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpvbv\" (UniqueName: \"kubernetes.io/projected/31c30b03-2316-4ccf-bb7d-379da1b6ba23-kube-api-access-xpvbv\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:41 crc kubenswrapper[4795]: I0320 18:35:41.989829 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.018354 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31c30b03-2316-4ccf-bb7d-379da1b6ba23" (UID: "31c30b03-2316-4ccf-bb7d-379da1b6ba23"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.092929 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31c30b03-2316-4ccf-bb7d-379da1b6ba23-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.279517 4795 generic.go:334] "Generic (PLEG): container finished" podID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerID="929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7" exitCode=0 Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.279560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerDied","Data":"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7"} Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.279591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sk5fl" event={"ID":"31c30b03-2316-4ccf-bb7d-379da1b6ba23","Type":"ContainerDied","Data":"ef650a02c257831f7b67cbc0f959703c18d4a6a48e121217bcbe9ea59e4d7d58"} Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.279612 4795 scope.go:117] "RemoveContainer" containerID="929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.279834 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sk5fl" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.314153 4795 scope.go:117] "RemoveContainer" containerID="6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.337608 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"] Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.344999 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sk5fl"] Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.353527 4795 scope.go:117] "RemoveContainer" containerID="bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.377970 4795 scope.go:117] "RemoveContainer" containerID="929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7" Mar 20 18:35:42 crc kubenswrapper[4795]: E0320 18:35:42.380185 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7\": container with ID starting with 929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7 not found: ID does not exist" containerID="929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.380216 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7"} err="failed to get container status \"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7\": rpc error: code = NotFound desc = could not find container \"929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7\": container with ID starting with 929128942c8209de5811ccce9c67e1851835a0a56eb849c9750f6d502d9ccec7 not found: ID does 
not exist" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.380255 4795 scope.go:117] "RemoveContainer" containerID="6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081" Mar 20 18:35:42 crc kubenswrapper[4795]: E0320 18:35:42.380597 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081\": container with ID starting with 6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081 not found: ID does not exist" containerID="6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.380640 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081"} err="failed to get container status \"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081\": rpc error: code = NotFound desc = could not find container \"6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081\": container with ID starting with 6b42d9ca53b4a7596e637fea990ea0c91c99d95aef4dee9c7168b35d645e3081 not found: ID does not exist" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.380666 4795 scope.go:117] "RemoveContainer" containerID="bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242" Mar 20 18:35:42 crc kubenswrapper[4795]: E0320 18:35:42.380959 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242\": container with ID starting with bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242 not found: ID does not exist" containerID="bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242" Mar 20 18:35:42 crc kubenswrapper[4795]: I0320 18:35:42.380981 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242"} err="failed to get container status \"bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242\": rpc error: code = NotFound desc = could not find container \"bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242\": container with ID starting with bac8e49a06666a5b19fc393c7639bbc80d1f8ab2405d63ef9c8b05b04094d242 not found: ID does not exist" Mar 20 18:35:42 crc kubenswrapper[4795]: E0320 18:35:42.486735 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c30b03_2316_4ccf_bb7d_379da1b6ba23.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c30b03_2316_4ccf_bb7d_379da1b6ba23.slice/crio-ef650a02c257831f7b67cbc0f959703c18d4a6a48e121217bcbe9ea59e4d7d58\": RecentStats: unable to find data in memory cache]" Mar 20 18:35:43 crc kubenswrapper[4795]: I0320 18:35:43.269272 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" path="/var/lib/kubelet/pods/31c30b03-2316-4ccf-bb7d-379da1b6ba23/volumes" Mar 20 18:35:53 crc kubenswrapper[4795]: I0320 18:35:53.252527 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:35:53 crc kubenswrapper[4795]: E0320 18:35:53.253321 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 
18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.171280 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567196-t9x4q"] Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.174515 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="extract-utilities" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.174581 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="extract-utilities" Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.175307 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="extract-utilities" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.175379 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="extract-utilities" Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.175410 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="extract-content" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.175425 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="extract-content" Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.175460 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="registry-server" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.175472 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="registry-server" Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.175529 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="registry-server" Mar 20 18:36:00 crc 
kubenswrapper[4795]: I0320 18:36:00.175542 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="registry-server" Mar 20 18:36:00 crc kubenswrapper[4795]: E0320 18:36:00.175580 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="extract-content" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.175591 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="extract-content" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.176469 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="31c30b03-2316-4ccf-bb7d-379da1b6ba23" containerName="registry-server" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.176512 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b25b0297-a790-4365-aa94-c551db2f983d" containerName="registry-server" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.180193 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.185474 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.185884 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.186343 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.197334 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-t9x4q"] Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.203409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfqdd\" (UniqueName: \"kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd\") pod \"auto-csr-approver-29567196-t9x4q\" (UID: \"7203100a-018c-4662-a760-a16bd5c6322d\") " pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.305095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfqdd\" (UniqueName: \"kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd\") pod \"auto-csr-approver-29567196-t9x4q\" (UID: \"7203100a-018c-4662-a760-a16bd5c6322d\") " pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.324418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfqdd\" (UniqueName: \"kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd\") pod \"auto-csr-approver-29567196-t9x4q\" (UID: \"7203100a-018c-4662-a760-a16bd5c6322d\") " 
pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.504952 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:00 crc kubenswrapper[4795]: I0320 18:36:00.955759 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-t9x4q"] Mar 20 18:36:00 crc kubenswrapper[4795]: W0320 18:36:00.960101 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7203100a_018c_4662_a760_a16bd5c6322d.slice/crio-556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8 WatchSource:0}: Error finding container 556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8: Status 404 returned error can't find the container with id 556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8 Mar 20 18:36:01 crc kubenswrapper[4795]: I0320 18:36:01.783985 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" event={"ID":"7203100a-018c-4662-a760-a16bd5c6322d","Type":"ContainerStarted","Data":"556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8"} Mar 20 18:36:02 crc kubenswrapper[4795]: I0320 18:36:02.793404 4795 generic.go:334] "Generic (PLEG): container finished" podID="7203100a-018c-4662-a760-a16bd5c6322d" containerID="dd786a69be248d53a6715fb536c79a06b01be09807a6bb21bbca9e7786db827c" exitCode=0 Mar 20 18:36:02 crc kubenswrapper[4795]: I0320 18:36:02.793858 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" event={"ID":"7203100a-018c-4662-a760-a16bd5c6322d","Type":"ContainerDied","Data":"dd786a69be248d53a6715fb536c79a06b01be09807a6bb21bbca9e7786db827c"} Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.251234 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.389246 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfqdd\" (UniqueName: \"kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd\") pod \"7203100a-018c-4662-a760-a16bd5c6322d\" (UID: \"7203100a-018c-4662-a760-a16bd5c6322d\") " Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.399680 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd" (OuterVolumeSpecName: "kube-api-access-wfqdd") pod "7203100a-018c-4662-a760-a16bd5c6322d" (UID: "7203100a-018c-4662-a760-a16bd5c6322d"). InnerVolumeSpecName "kube-api-access-wfqdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.495388 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfqdd\" (UniqueName: \"kubernetes.io/projected/7203100a-018c-4662-a760-a16bd5c6322d-kube-api-access-wfqdd\") on node \"crc\" DevicePath \"\"" Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.818976 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" event={"ID":"7203100a-018c-4662-a760-a16bd5c6322d","Type":"ContainerDied","Data":"556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8"} Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.819037 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567196-t9x4q" Mar 20 18:36:04 crc kubenswrapper[4795]: I0320 18:36:04.819054 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="556f7b75c93fd7c3c4c6c56c03df3c5cc21a3b58f1801fa6e981e63a578098b8" Mar 20 18:36:05 crc kubenswrapper[4795]: I0320 18:36:05.329166 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-fmpq8"] Mar 20 18:36:05 crc kubenswrapper[4795]: I0320 18:36:05.344287 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-fmpq8"] Mar 20 18:36:07 crc kubenswrapper[4795]: I0320 18:36:07.270156 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e72f01-1ab6-47a2-99d2-ff2778039c34" path="/var/lib/kubelet/pods/a7e72f01-1ab6-47a2-99d2-ff2778039c34/volumes" Mar 20 18:36:08 crc kubenswrapper[4795]: I0320 18:36:08.252276 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:36:08 crc kubenswrapper[4795]: E0320 18:36:08.252969 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:36:12 crc kubenswrapper[4795]: I0320 18:36:12.506520 4795 scope.go:117] "RemoveContainer" containerID="e6a437e3ef5671482fc87ddf7b0443a4a6151e38d08d0d94800ebfc859f95be2" Mar 20 18:36:22 crc kubenswrapper[4795]: I0320 18:36:22.252399 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:36:22 crc kubenswrapper[4795]: E0320 18:36:22.253291 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:36:34 crc kubenswrapper[4795]: I0320 18:36:34.252381 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:36:34 crc kubenswrapper[4795]: E0320 18:36:34.253256 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:36:48 crc kubenswrapper[4795]: I0320 18:36:48.253428 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:36:49 crc kubenswrapper[4795]: I0320 18:36:49.320799 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e"} Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.860655 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4zsvz/must-gather-k5nt7"] Mar 20 18:36:52 crc kubenswrapper[4795]: E0320 18:36:52.861424 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7203100a-018c-4662-a760-a16bd5c6322d" containerName="oc" Mar 20 18:36:52 crc 
kubenswrapper[4795]: I0320 18:36:52.861436 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7203100a-018c-4662-a760-a16bd5c6322d" containerName="oc" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.861619 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7203100a-018c-4662-a760-a16bd5c6322d" containerName="oc" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.862579 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.872161 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4zsvz"/"default-dockercfg-kfszv" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.872258 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4zsvz"/"kube-root-ca.crt" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.872295 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4zsvz"/"openshift-service-ca.crt" Mar 20 18:36:52 crc kubenswrapper[4795]: I0320 18:36:52.881849 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4zsvz/must-gather-k5nt7"] Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.058842 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output\") pod \"must-gather-k5nt7\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.059306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wwcp\" (UniqueName: \"kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp\") pod \"must-gather-k5nt7\" 
(UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.161442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wwcp\" (UniqueName: \"kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp\") pod \"must-gather-k5nt7\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.161533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output\") pod \"must-gather-k5nt7\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.162104 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output\") pod \"must-gather-k5nt7\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.180812 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wwcp\" (UniqueName: \"kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp\") pod \"must-gather-k5nt7\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.193320 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:36:53 crc kubenswrapper[4795]: I0320 18:36:53.704302 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4zsvz/must-gather-k5nt7"] Mar 20 18:36:54 crc kubenswrapper[4795]: I0320 18:36:54.377715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" event={"ID":"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e","Type":"ContainerStarted","Data":"7e428e78fa28b099376d48c4437d22fc8fb058f496bc7d67f2ecb14cd1bd3b22"} Mar 20 18:36:54 crc kubenswrapper[4795]: I0320 18:36:54.378350 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" event={"ID":"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e","Type":"ContainerStarted","Data":"f7dc1c0fd67006d7343e6ade29b0720cb88d319027089039c39df72434041f5f"} Mar 20 18:36:54 crc kubenswrapper[4795]: I0320 18:36:54.378373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" event={"ID":"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e","Type":"ContainerStarted","Data":"b6390a675269ff059a18fc9962b84b8fffd34764b948261b176265f106e342a0"} Mar 20 18:36:54 crc kubenswrapper[4795]: I0320 18:36:54.409527 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" podStartSLOduration=2.409502483 podStartE2EDuration="2.409502483s" podCreationTimestamp="2026-03-20 18:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:36:54.396664085 +0000 UTC m=+4757.854695636" watchObservedRunningTime="2026-03-20 18:36:54.409502483 +0000 UTC m=+4757.867534054" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.581436 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-stnvk"] Mar 20 18:36:57 crc kubenswrapper[4795]: 
I0320 18:36:57.583065 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.769155 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.769225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wghz7\" (UniqueName: \"kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.871520 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.871586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wghz7\" (UniqueName: \"kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.872025 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") 
" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.893198 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wghz7\" (UniqueName: \"kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7\") pod \"crc-debug-stnvk\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: I0320 18:36:57.899570 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:36:57 crc kubenswrapper[4795]: W0320 18:36:57.941323 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b756ee6_eaba_4ef9_8aeb_932fa022ff67.slice/crio-0a4dc9044ec952b8781529a760ed519c3480b867616fce2b184ea3da7fd1787a WatchSource:0}: Error finding container 0a4dc9044ec952b8781529a760ed519c3480b867616fce2b184ea3da7fd1787a: Status 404 returned error can't find the container with id 0a4dc9044ec952b8781529a760ed519c3480b867616fce2b184ea3da7fd1787a Mar 20 18:36:58 crc kubenswrapper[4795]: I0320 18:36:58.421281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" event={"ID":"1b756ee6-eaba-4ef9-8aeb-932fa022ff67","Type":"ContainerStarted","Data":"aec27dd4223f5fe63f5054576fd5e2585cf7b3f8895b249700a87a28254da07b"} Mar 20 18:36:58 crc kubenswrapper[4795]: I0320 18:36:58.421936 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" event={"ID":"1b756ee6-eaba-4ef9-8aeb-932fa022ff67","Type":"ContainerStarted","Data":"0a4dc9044ec952b8781529a760ed519c3480b867616fce2b184ea3da7fd1787a"} Mar 20 18:37:37 crc kubenswrapper[4795]: I0320 18:37:37.791763 4795 generic.go:334] "Generic (PLEG): container finished" podID="1b756ee6-eaba-4ef9-8aeb-932fa022ff67" 
containerID="aec27dd4223f5fe63f5054576fd5e2585cf7b3f8895b249700a87a28254da07b" exitCode=0 Mar 20 18:37:37 crc kubenswrapper[4795]: I0320 18:37:37.791882 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" event={"ID":"1b756ee6-eaba-4ef9-8aeb-932fa022ff67","Type":"ContainerDied","Data":"aec27dd4223f5fe63f5054576fd5e2585cf7b3f8895b249700a87a28254da07b"} Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.905574 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.921588 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host\") pod \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.921771 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wghz7\" (UniqueName: \"kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7\") pod \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\" (UID: \"1b756ee6-eaba-4ef9-8aeb-932fa022ff67\") " Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.924259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host" (OuterVolumeSpecName: "host") pod "1b756ee6-eaba-4ef9-8aeb-932fa022ff67" (UID: "1b756ee6-eaba-4ef9-8aeb-932fa022ff67"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.929268 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7" (OuterVolumeSpecName: "kube-api-access-wghz7") pod "1b756ee6-eaba-4ef9-8aeb-932fa022ff67" (UID: "1b756ee6-eaba-4ef9-8aeb-932fa022ff67"). InnerVolumeSpecName "kube-api-access-wghz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.943902 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-stnvk"] Mar 20 18:37:38 crc kubenswrapper[4795]: I0320 18:37:38.951449 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-stnvk"] Mar 20 18:37:39 crc kubenswrapper[4795]: I0320 18:37:39.025072 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:39 crc kubenswrapper[4795]: I0320 18:37:39.025121 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wghz7\" (UniqueName: \"kubernetes.io/projected/1b756ee6-eaba-4ef9-8aeb-932fa022ff67-kube-api-access-wghz7\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:39 crc kubenswrapper[4795]: I0320 18:37:39.278065 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b756ee6-eaba-4ef9-8aeb-932fa022ff67" path="/var/lib/kubelet/pods/1b756ee6-eaba-4ef9-8aeb-932fa022ff67/volumes" Mar 20 18:37:39 crc kubenswrapper[4795]: I0320 18:37:39.811023 4795 scope.go:117] "RemoveContainer" containerID="aec27dd4223f5fe63f5054576fd5e2585cf7b3f8895b249700a87a28254da07b" Mar 20 18:37:39 crc kubenswrapper[4795]: I0320 18:37:39.811048 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-stnvk" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.126554 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-t9n52"] Mar 20 18:37:40 crc kubenswrapper[4795]: E0320 18:37:40.127028 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b756ee6-eaba-4ef9-8aeb-932fa022ff67" containerName="container-00" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.127045 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b756ee6-eaba-4ef9-8aeb-932fa022ff67" containerName="container-00" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.127297 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b756ee6-eaba-4ef9-8aeb-932fa022ff67" containerName="container-00" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.128195 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.247037 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.247114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbk5c\" (UniqueName: \"kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.348780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.348834 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbk5c\" (UniqueName: \"kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.348927 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.369454 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbk5c\" (UniqueName: \"kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c\") pod \"crc-debug-t9n52\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.449015 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.819953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" event={"ID":"a2ef605a-8561-4538-9d09-b6635a813341","Type":"ContainerStarted","Data":"cd8251461060b3f830ae5964db666604d5a280ac18ed90797c8b57637d858b58"} Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.820009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" event={"ID":"a2ef605a-8561-4538-9d09-b6635a813341","Type":"ContainerStarted","Data":"16a3231887d7ca689198aece22c1fca21228396e2e6bf1e6133f03b90cbf7649"} Mar 20 18:37:40 crc kubenswrapper[4795]: I0320 18:37:40.836108 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" podStartSLOduration=0.836090887 podStartE2EDuration="836.090887ms" podCreationTimestamp="2026-03-20 18:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:37:40.830584057 +0000 UTC m=+4804.288615598" watchObservedRunningTime="2026-03-20 18:37:40.836090887 +0000 UTC m=+4804.294122418" Mar 20 18:37:41 crc kubenswrapper[4795]: I0320 18:37:41.844126 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2ef605a-8561-4538-9d09-b6635a813341" containerID="cd8251461060b3f830ae5964db666604d5a280ac18ed90797c8b57637d858b58" exitCode=0 Mar 20 18:37:41 crc kubenswrapper[4795]: I0320 18:37:41.844140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" event={"ID":"a2ef605a-8561-4538-9d09-b6635a813341","Type":"ContainerDied","Data":"cd8251461060b3f830ae5964db666604d5a280ac18ed90797c8b57637d858b58"} Mar 20 18:37:42 crc kubenswrapper[4795]: I0320 18:37:42.953249 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.099001 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbk5c\" (UniqueName: \"kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c\") pod \"a2ef605a-8561-4538-9d09-b6635a813341\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.099206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host\") pod \"a2ef605a-8561-4538-9d09-b6635a813341\" (UID: \"a2ef605a-8561-4538-9d09-b6635a813341\") " Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.099331 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host" (OuterVolumeSpecName: "host") pod "a2ef605a-8561-4538-9d09-b6635a813341" (UID: "a2ef605a-8561-4538-9d09-b6635a813341"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.099653 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ef605a-8561-4538-9d09-b6635a813341-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.111024 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c" (OuterVolumeSpecName: "kube-api-access-xbk5c") pod "a2ef605a-8561-4538-9d09-b6635a813341" (UID: "a2ef605a-8561-4538-9d09-b6635a813341"). InnerVolumeSpecName "kube-api-access-xbk5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.118017 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-t9n52"] Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.128215 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-t9n52"] Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.201049 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbk5c\" (UniqueName: \"kubernetes.io/projected/a2ef605a-8561-4538-9d09-b6635a813341-kube-api-access-xbk5c\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.262776 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ef605a-8561-4538-9d09-b6635a813341" path="/var/lib/kubelet/pods/a2ef605a-8561-4538-9d09-b6635a813341/volumes" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.857136 4795 scope.go:117] "RemoveContainer" containerID="cd8251461060b3f830ae5964db666604d5a280ac18ed90797c8b57637d858b58" Mar 20 18:37:43 crc kubenswrapper[4795]: I0320 18:37:43.857189 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-t9n52" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.275542 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-sz4jb"] Mar 20 18:37:44 crc kubenswrapper[4795]: E0320 18:37:44.276064 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ef605a-8561-4538-9d09-b6635a813341" containerName="container-00" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.276078 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ef605a-8561-4538-9d09-b6635a813341" containerName="container-00" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.276244 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ef605a-8561-4538-9d09-b6635a813341" containerName="container-00" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.276940 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.421117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.421261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfm2r\" (UniqueName: \"kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.523188 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfm2r\" (UniqueName: 
\"kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.523367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.523476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.540745 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfm2r\" (UniqueName: \"kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r\") pod \"crc-debug-sz4jb\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.628403 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:44 crc kubenswrapper[4795]: I0320 18:37:44.867490 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" event={"ID":"dede02ab-ca23-4adf-9816-ac5ba6aa81b5","Type":"ContainerStarted","Data":"bd1eb6bb2bc7f7f800ad219a622e9dcf1061f7d09315438bdb06c0fefccbe684"} Mar 20 18:37:45 crc kubenswrapper[4795]: I0320 18:37:45.883365 4795 generic.go:334] "Generic (PLEG): container finished" podID="dede02ab-ca23-4adf-9816-ac5ba6aa81b5" containerID="492c90a8d086b5d0bece8209730cbc8cb52f67859378e5062d5adb6e9f1149f8" exitCode=0 Mar 20 18:37:45 crc kubenswrapper[4795]: I0320 18:37:45.883469 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" event={"ID":"dede02ab-ca23-4adf-9816-ac5ba6aa81b5","Type":"ContainerDied","Data":"492c90a8d086b5d0bece8209730cbc8cb52f67859378e5062d5adb6e9f1149f8"} Mar 20 18:37:45 crc kubenswrapper[4795]: I0320 18:37:45.924046 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-sz4jb"] Mar 20 18:37:45 crc kubenswrapper[4795]: I0320 18:37:45.934549 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4zsvz/crc-debug-sz4jb"] Mar 20 18:37:46 crc kubenswrapper[4795]: I0320 18:37:46.982088 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.168884 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfm2r\" (UniqueName: \"kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r\") pod \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.169315 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host\") pod \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\" (UID: \"dede02ab-ca23-4adf-9816-ac5ba6aa81b5\") " Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.169393 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host" (OuterVolumeSpecName: "host") pod "dede02ab-ca23-4adf-9816-ac5ba6aa81b5" (UID: "dede02ab-ca23-4adf-9816-ac5ba6aa81b5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.169996 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.180516 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r" (OuterVolumeSpecName: "kube-api-access-lfm2r") pod "dede02ab-ca23-4adf-9816-ac5ba6aa81b5" (UID: "dede02ab-ca23-4adf-9816-ac5ba6aa81b5"). InnerVolumeSpecName "kube-api-access-lfm2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.264451 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dede02ab-ca23-4adf-9816-ac5ba6aa81b5" path="/var/lib/kubelet/pods/dede02ab-ca23-4adf-9816-ac5ba6aa81b5/volumes" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.278008 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfm2r\" (UniqueName: \"kubernetes.io/projected/dede02ab-ca23-4adf-9816-ac5ba6aa81b5-kube-api-access-lfm2r\") on node \"crc\" DevicePath \"\"" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.901210 4795 scope.go:117] "RemoveContainer" containerID="492c90a8d086b5d0bece8209730cbc8cb52f67859378e5062d5adb6e9f1149f8" Mar 20 18:37:47 crc kubenswrapper[4795]: I0320 18:37:47.901244 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/crc-debug-sz4jb" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.161569 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567198-hmxc4"] Mar 20 18:38:00 crc kubenswrapper[4795]: E0320 18:38:00.164133 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dede02ab-ca23-4adf-9816-ac5ba6aa81b5" containerName="container-00" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.164306 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dede02ab-ca23-4adf-9816-ac5ba6aa81b5" containerName="container-00" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.164739 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dede02ab-ca23-4adf-9816-ac5ba6aa81b5" containerName="container-00" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.165632 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.172269 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.172462 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.172721 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.176851 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-hmxc4"] Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.242642 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvvxb\" (UniqueName: \"kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb\") pod \"auto-csr-approver-29567198-hmxc4\" (UID: \"73777af0-dee3-47d4-a9d2-a48649e84e4d\") " pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.344618 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvvxb\" (UniqueName: \"kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb\") pod \"auto-csr-approver-29567198-hmxc4\" (UID: \"73777af0-dee3-47d4-a9d2-a48649e84e4d\") " pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.362498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvvxb\" (UniqueName: \"kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb\") pod \"auto-csr-approver-29567198-hmxc4\" (UID: \"73777af0-dee3-47d4-a9d2-a48649e84e4d\") " 
pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.491888 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:00 crc kubenswrapper[4795]: I0320 18:38:00.966523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-hmxc4"] Mar 20 18:38:01 crc kubenswrapper[4795]: I0320 18:38:01.020836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" event={"ID":"73777af0-dee3-47d4-a9d2-a48649e84e4d","Type":"ContainerStarted","Data":"9c6787e09d922da99c9dc942ad9c34c154b0df26e872c3ec0399f194f3baab37"} Mar 20 18:38:03 crc kubenswrapper[4795]: I0320 18:38:03.039455 4795 generic.go:334] "Generic (PLEG): container finished" podID="73777af0-dee3-47d4-a9d2-a48649e84e4d" containerID="6303fb5093a0bb0022b32e4ef548448b867a34002a11dd9fd46d7dd786dcfd17" exitCode=0 Mar 20 18:38:03 crc kubenswrapper[4795]: I0320 18:38:03.039556 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" event={"ID":"73777af0-dee3-47d4-a9d2-a48649e84e4d","Type":"ContainerDied","Data":"6303fb5093a0bb0022b32e4ef548448b867a34002a11dd9fd46d7dd786dcfd17"} Mar 20 18:38:04 crc kubenswrapper[4795]: I0320 18:38:04.436535 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:04 crc kubenswrapper[4795]: I0320 18:38:04.624848 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvvxb\" (UniqueName: \"kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb\") pod \"73777af0-dee3-47d4-a9d2-a48649e84e4d\" (UID: \"73777af0-dee3-47d4-a9d2-a48649e84e4d\") " Mar 20 18:38:04 crc kubenswrapper[4795]: I0320 18:38:04.636767 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb" (OuterVolumeSpecName: "kube-api-access-tvvxb") pod "73777af0-dee3-47d4-a9d2-a48649e84e4d" (UID: "73777af0-dee3-47d4-a9d2-a48649e84e4d"). InnerVolumeSpecName "kube-api-access-tvvxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:38:04 crc kubenswrapper[4795]: I0320 18:38:04.728233 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvvxb\" (UniqueName: \"kubernetes.io/projected/73777af0-dee3-47d4-a9d2-a48649e84e4d-kube-api-access-tvvxb\") on node \"crc\" DevicePath \"\"" Mar 20 18:38:05 crc kubenswrapper[4795]: I0320 18:38:05.073031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" event={"ID":"73777af0-dee3-47d4-a9d2-a48649e84e4d","Type":"ContainerDied","Data":"9c6787e09d922da99c9dc942ad9c34c154b0df26e872c3ec0399f194f3baab37"} Mar 20 18:38:05 crc kubenswrapper[4795]: I0320 18:38:05.073079 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c6787e09d922da99c9dc942ad9c34c154b0df26e872c3ec0399f194f3baab37" Mar 20 18:38:05 crc kubenswrapper[4795]: I0320 18:38:05.073141 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567198-hmxc4" Mar 20 18:38:05 crc kubenswrapper[4795]: I0320 18:38:05.508593 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-4p2jk"] Mar 20 18:38:05 crc kubenswrapper[4795]: I0320 18:38:05.518435 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567192-4p2jk"] Mar 20 18:38:07 crc kubenswrapper[4795]: I0320 18:38:07.278162 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cec1e8e-999e-44e2-a9b5-387a10c5de11" path="/var/lib/kubelet/pods/6cec1e8e-999e-44e2-a9b5-387a10c5de11/volumes" Mar 20 18:38:12 crc kubenswrapper[4795]: I0320 18:38:12.904486 4795 scope.go:117] "RemoveContainer" containerID="aef631dab3c0134ed5918f9313f1d6153c072f4d69296d8ee64df68955ab56a0" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.173511 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84776bb8f8-wkk7m_6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97/barbican-api/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.294024 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84776bb8f8-wkk7m_6dbcc77c-74d1-4b44-b0b6-bcb1297d6d97/barbican-api-log/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.364359 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76977cb5bb-84w8l_faa8c15c-b759-4db8-ac4d-28648a8cfde2/barbican-keystone-listener/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.507661 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76977cb5bb-84w8l_faa8c15c-b759-4db8-ac4d-28648a8cfde2/barbican-keystone-listener-log/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.569840 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-558cc4f6c9-d6wp7_aa9d179b-7e78-4a37-80aa-3f3f6e7cabea/barbican-worker-log/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.580669 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-558cc4f6c9-d6wp7_aa9d179b-7e78-4a37-80aa-3f3f6e7cabea/barbican-worker/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.876555 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/ceilometer-central-agent/0.log" Mar 20 18:38:33 crc kubenswrapper[4795]: I0320 18:38:33.962070 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/ceilometer-notification-agent/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.059775 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/proxy-httpd/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.065111 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_81c4fa02-a2cf-4349-afe3-292e38b50e33/sg-core/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.069677 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-65dps_0708214e-e711-465a-a54e-97a462b2777e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.277205 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0b19426b-81a4-4498-9754-948e8b7154d9/cinder-api-log/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.297836 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0b19426b-81a4-4498-9754-948e8b7154d9/cinder-api/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.471851 4795 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_cinder-scheduler-0_8eeb13a5-ef36-44eb-9dfd-7798e9ad1620/cinder-scheduler/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.529240 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8eeb13a5-ef36-44eb-9dfd-7798e9ad1620/probe/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.750270 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hprm9_2bad20c9-d77a-4c30-8fa2-979c05697cf4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:34 crc kubenswrapper[4795]: I0320 18:38:34.947287 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-x6rwm_3d666090-1065-4b2d-9ac6-b84776b53d0a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.042966 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/init/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.218691 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/init/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.375468 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-ch8jm_5c5c2934-fe58-4707-9bb7-a5e2372bad83/dnsmasq-dns/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.512497 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8bm9k_b0af5324-4ba3-4a12-9fdb-b467918ba19d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.572298 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_264c2db4-1919-41ce-aea3-bd777167a9ca/glance-log/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.582225 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_264c2db4-1919-41ce-aea3-bd777167a9ca/glance-httpd/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.718152 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_81d40eb0-c26d-46e7-b8be-631de2f502b9/glance-httpd/0.log" Mar 20 18:38:35 crc kubenswrapper[4795]: I0320 18:38:35.792406 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_81d40eb0-c26d-46e7-b8be-631de2f502b9/glance-log/0.log" Mar 20 18:38:36 crc kubenswrapper[4795]: I0320 18:38:36.046381 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7fb74ddb8-dbrvh_f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109/horizon/0.log" Mar 20 18:38:36 crc kubenswrapper[4795]: I0320 18:38:36.195604 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-vkrf5_0ab0ae7d-87ee-4e3f-a963-d126c5ddab8c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:36 crc kubenswrapper[4795]: I0320 18:38:36.529774 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7fb74ddb8-dbrvh_f3bb3cb2-7e6a-4c4d-9cb9-cd8d6683c109/horizon-log/0.log" Mar 20 18:38:36 crc kubenswrapper[4795]: I0320 18:38:36.782433 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567161-t26vc_cdfe5ffc-ab15-4277-966f-f506e725e8b1/keystone-cron/0.log" Mar 20 18:38:36 crc kubenswrapper[4795]: I0320 18:38:36.982182 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_72605c7d-99df-450f-900b-3022b0520149/kube-state-metrics/0.log" Mar 20 18:38:37 crc kubenswrapper[4795]: I0320 
18:38:37.351332 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85b996ff68-fdzxg_7b20a034-11f6-40ad-9447-32c49f705c07/keystone-api/0.log" Mar 20 18:38:37 crc kubenswrapper[4795]: I0320 18:38:37.420584 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5rj55_20b330a0-830c-419e-81fe-a36dd1a32cc2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:38 crc kubenswrapper[4795]: I0320 18:38:38.038108 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-649db44647-mrjns_5a472785-4467-4c97-93b9-e6f6eff19126/neutron-httpd/0.log" Mar 20 18:38:38 crc kubenswrapper[4795]: I0320 18:38:38.225235 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-t9lk7_e29f4857-ff0d-4806-ba09-74448200e8e2/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:38 crc kubenswrapper[4795]: I0320 18:38:38.248461 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-649db44647-mrjns_5a472785-4467-4c97-93b9-e6f6eff19126/neutron-api/0.log" Mar 20 18:38:39 crc kubenswrapper[4795]: I0320 18:38:39.261522 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5916e4d2-2863-4088-be97-cf368906820b/nova-cell0-conductor-conductor/0.log" Mar 20 18:38:39 crc kubenswrapper[4795]: I0320 18:38:39.668178 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_480a6609-0395-4bda-9ec8-a3ebf30931a7/nova-api-log/0.log" Mar 20 18:38:39 crc kubenswrapper[4795]: I0320 18:38:39.740429 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rp77q_b6da9d2a-e18f-4994-b8f3-6b1eb969564b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:39 crc kubenswrapper[4795]: I0320 18:38:39.878438 4795 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_19c15c93-572c-4d53-b924-172f3ad29c8a/nova-cell1-conductor-conductor/0.log" Mar 20 18:38:40 crc kubenswrapper[4795]: I0320 18:38:40.148504 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d2a5e398-6d25-43b1-8c29-407af2d9348b/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 18:38:40 crc kubenswrapper[4795]: I0320 18:38:40.436204 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_480a6609-0395-4bda-9ec8-a3ebf30931a7/nova-api-api/0.log" Mar 20 18:38:40 crc kubenswrapper[4795]: I0320 18:38:40.452918 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ff4af01b-01b5-4154-8591-7ec99e3d6ef0/nova-metadata-log/0.log" Mar 20 18:38:41 crc kubenswrapper[4795]: I0320 18:38:41.069193 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/mysql-bootstrap/0.log" Mar 20 18:38:41 crc kubenswrapper[4795]: I0320 18:38:41.171746 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ff4af01b-01b5-4154-8591-7ec99e3d6ef0/nova-metadata-metadata/0.log" Mar 20 18:38:41 crc kubenswrapper[4795]: I0320 18:38:41.249538 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c23f56ff-eceb-4891-87e5-57ebeb7eba8d/nova-scheduler-scheduler/0.log" Mar 20 18:38:41 crc kubenswrapper[4795]: I0320 18:38:41.935964 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-kc4wx_709f5080-c511-4d3b-bc9c-baeec85fa245/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.162651 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/galera/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 
18:38:42.177076 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6c96c5e9-4c52-4a2c-b8ef-51cd8ba41987/mysql-bootstrap/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.253329 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/mysql-bootstrap/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.482972 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/galera/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.511185 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_cf3f8aea-393e-418a-ad14-2848c8df93e9/openstackclient/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.536863 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f5a24ef-fc80-4386-9f81-5f21154223f3/mysql-bootstrap/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.690736 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dnp2g_28df10bb-d6a9-47a9-9b79-0bb9665529ef/ovn-controller/0.log" Mar 20 18:38:42 crc kubenswrapper[4795]: I0320 18:38:42.751943 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-n4gzx_85004117-20bc-474e-88f5-ce49032749ff/openstack-network-exporter/0.log" Mar 20 18:38:43 crc kubenswrapper[4795]: I0320 18:38:43.061136 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server-init/0.log" Mar 20 18:38:43 crc kubenswrapper[4795]: I0320 18:38:43.750085 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server-init/0.log" Mar 20 18:38:43 crc kubenswrapper[4795]: I0320 18:38:43.754303 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovs-vswitchd/0.log" Mar 20 18:38:43 crc kubenswrapper[4795]: I0320 18:38:43.820255 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dsqcc_5675bf5e-3a57-4082-8631-680ced6fb634/ovsdb-server/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.024917 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6cfc9397-7268-4bd1-8bbf-d107e94ab35a/openstack-network-exporter/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.137708 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6cfc9397-7268-4bd1-8bbf-d107e94ab35a/ovn-northd/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.214590 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9jw45_6c737290-0616-475b-a839-cca387d8d90d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.258501 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b9d4ac2-2b66-441a-a6d4-0d467d857f99/openstack-network-exporter/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.326850 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3b9d4ac2-2b66-441a-a6d4-0d467d857f99/ovsdbserver-nb/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.494943 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c07f346e-3e6c-41a5-bdda-67a4a5f04ba7/openstack-network-exporter/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.497213 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c07f346e-3e6c-41a5-bdda-67a4a5f04ba7/ovsdbserver-sb/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 
18:38:44.819854 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-fc784f9bb-wjct6_48841a5b-142c-49d0-8e87-8562f8d1f824/placement-api/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.860453 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/setup-container/0.log" Mar 20 18:38:44 crc kubenswrapper[4795]: I0320 18:38:44.912277 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-fc784f9bb-wjct6_48841a5b-142c-49d0-8e87-8562f8d1f824/placement-log/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.069863 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/setup-container/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.146765 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_30c1ffc4-752a-4b0a-a95b-2bfbc458dc53/rabbitmq/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.155465 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/setup-container/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.403743 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/setup-container/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.428544 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad841f4d-fa5f-4383-86d5-ab5a93f6e7fc/rabbitmq/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.488883 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fpv88_1f1dc7c7-9276-4c0a-a24b-2c7b303d59e4/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:45 crc 
kubenswrapper[4795]: I0320 18:38:45.655870 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-tx6d9_d7dc5d37-6d24-48ea-acc1-2b4ed3de6936/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.704956 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-bnskk_e7bbeb5c-0f49-4fb3-b0b4-57c9bf91977e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:45 crc kubenswrapper[4795]: I0320 18:38:45.981900 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-prfq6_9cdb4943-60a1-41cc-aead-1702a4c1f68a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.004098 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-j6rls_80cf5a83-936d-4789-a7bc-b91cdb0e564d/ssh-known-hosts-edpm-deployment/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.202040 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6697f55ff5-fj55x_e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6/proxy-server/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.291458 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6697f55ff5-fj55x_e7d2be98-dcc2-408e-9da2-4f1d3b83b2b6/proxy-httpd/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.300062 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-m8zw5_2c422574-0103-4c97-9e23-5a78c5b44e69/swift-ring-rebalance/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.495493 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-auditor/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 
18:38:46.559941 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-reaper/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.587023 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-replicator/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.850816 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ca95ec62-fce9-4c91-bb59-fa80f512edba/memcached/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.852315 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-auditor/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.860481 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/account-server/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.922229 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-replicator/0.log" Mar 20 18:38:46 crc kubenswrapper[4795]: I0320 18:38:46.953071 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-server/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.029359 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/container-updater/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.077944 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-expirer/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.098230 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-auditor/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.141793 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-server/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.163889 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-replicator/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.280316 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/swift-recon-cron/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.281705 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/rsync/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.291809 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_88e955e5-ba7a-4582-9d52-40333fe21b7f/object-updater/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.574628 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_caaf60a5-8c45-4831-8d26-8cf808f1da7a/tempest-tests-tempest-tests-runner/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.652933 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3ec75200-ce24-4bfd-9bf3-df2cd9d7d6a0/test-operator-logs-container/0.log" Mar 20 18:38:47 crc kubenswrapper[4795]: I0320 18:38:47.825872 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-vgxx5_35b4aa82-d668-474b-b54d-b540190f5a6c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:38:48 crc kubenswrapper[4795]: 
I0320 18:38:48.212998 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nwfsh_d519d04c-89f1-46b7-8136-1a9596af73ac/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:39:11 crc kubenswrapper[4795]: I0320 18:39:11.300471 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:39:11 crc kubenswrapper[4795]: I0320 18:39:11.301867 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.532106 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.700329 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.709123 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.744019 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.906985 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/util/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.914609 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/pull/0.log" Mar 20 18:39:14 crc kubenswrapper[4795]: I0320 18:39:14.929123 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f28zrz5_f7571109-7ce9-44a8-9275-4af4fadbd0e6/extract/0.log" Mar 20 18:39:15 crc kubenswrapper[4795]: I0320 18:39:15.192596 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-5hzvs_afefdb79-bad6-4deb-904b-515174cca414/manager/0.log" Mar 20 18:39:15 crc kubenswrapper[4795]: I0320 18:39:15.312671 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-jgs27_43804d6b-2358-46fd-bf04-26b2308f8ab0/manager/0.log" Mar 20 18:39:15 crc kubenswrapper[4795]: I0320 18:39:15.506655 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-dwx6n_a957ef3d-357c-4aa4-865c-533f889257d7/manager/0.log" Mar 20 18:39:15 crc kubenswrapper[4795]: I0320 18:39:15.610584 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-rmcrf_4cdd16c5-b7d3-4c52-a286-f3555daf43d9/manager/0.log" Mar 20 18:39:15 crc kubenswrapper[4795]: I0320 
18:39:15.769360 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-f74p9_ded84ba8-d70a-4379-bc80-d142e5306cc7/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.082765 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-55vp5_9cba9cd3-4144-4262-82a2-f2330793aae6/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.195511 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6f8b7f6fdf-lrjfh_fc0f2e63-50dd-424e-af01-3d09c9edd5b3/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.379264 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-6hsxn_84901a7b-ddbf-47d9-954f-c167cd9cd46c/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.426560 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-trjt4_7a887d91-fa86-45d2-a6be-aa7326f7d544/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.732258 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-jfdzb_071f0af8-4164-4f95-b0ee-720e3b3097f3/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.894440 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-h9f9t_21481bba-04ec-47ce-95d0-fe27787a3d62/manager/0.log" Mar 20 18:39:16 crc kubenswrapper[4795]: I0320 18:39:16.901597 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-bqzcz_0ffe016b-8919-4b8f-839c-669637b7accc/manager/0.log" Mar 20 18:39:16 crc 
kubenswrapper[4795]: I0320 18:39:16.995297 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-5v5sg_0da03e08-561c-4b5f-89c7-af80c8f39f54/manager/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.055367 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-n7cl7_d4ff6977-1303-4267-983e-3e99935f2aae/manager/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.158647 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f557zsq_a0fa84d9-bfa8-4b4a-82d7-51e5ae87e0d2/manager/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.310121 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-65b67cc5c9-vm29j_084071f5-e58b-451b-9cf5-67203ae1ba02/operator/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.518722 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-b6ckg_3aeffd27-d2c7-4744-8e01-07a4db74597e/registry-server/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.721392 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-dtfmz_84a19583-b173-4fb9-8b83-d9c41a5faf79/manager/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.791358 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-6cw7v_b47e6216-2e29-4d58-8b0c-5970aee6307b/manager/0.log" Mar 20 18:39:17 crc kubenswrapper[4795]: I0320 18:39:17.963597 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-828jr_750d9405-0514-4876-821e-9ab1f6871e87/manager/0.log" Mar 20 18:39:18 
crc kubenswrapper[4795]: I0320 18:39:18.222875 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-jbwss_46248665-6f9f-46e0-8db7-6be8c47cf521/manager/0.log" Mar 20 18:39:18 crc kubenswrapper[4795]: I0320 18:39:18.282038 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-rv5df_e13cbad4-3a2f-4b3c-82d8-c3984c5a9f38/manager/0.log" Mar 20 18:39:18 crc kubenswrapper[4795]: I0320 18:39:18.483560 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-56f44579c8-px2ft_0d8b26db-957e-4c0e-bb22-42f12d5beb0b/manager/0.log" Mar 20 18:39:18 crc kubenswrapper[4795]: I0320 18:39:18.498742 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-6z7j5_933bcfd5-f2d1-404f-876d-1d3da597f415/manager/0.log" Mar 20 18:39:40 crc kubenswrapper[4795]: I0320 18:39:40.096169 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-knn77_cd9b8a97-1b9d-4365-a985-a02d4078e3c2/control-plane-machine-set-operator/0.log" Mar 20 18:39:40 crc kubenswrapper[4795]: I0320 18:39:40.283482 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-p5hmr_9f31b9ac-9447-4b20-ac60-7532edfa4600/machine-api-operator/0.log" Mar 20 18:39:40 crc kubenswrapper[4795]: I0320 18:39:40.294239 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-p5hmr_9f31b9ac-9447-4b20-ac60-7532edfa4600/kube-rbac-proxy/0.log" Mar 20 18:39:41 crc kubenswrapper[4795]: I0320 18:39:41.300639 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:39:41 crc kubenswrapper[4795]: I0320 18:39:41.300740 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:39:54 crc kubenswrapper[4795]: I0320 18:39:54.842856 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-lqmsr_5231a25a-8bda-4f72-8a81-e5a49cdc31eb/cert-manager-controller/0.log" Mar 20 18:39:54 crc kubenswrapper[4795]: I0320 18:39:54.942421 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-smr2n_7df834a3-0298-4cc9-8b4e-49ce3f51183e/cert-manager-cainjector/0.log" Mar 20 18:39:55 crc kubenswrapper[4795]: I0320 18:39:55.035754 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cff8c_88832f68-9f72-4321-8d3f-bb3e23465fdb/cert-manager-webhook/0.log" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.140381 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567200-sb6x8"] Mar 20 18:40:00 crc kubenswrapper[4795]: E0320 18:40:00.141363 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73777af0-dee3-47d4-a9d2-a48649e84e4d" containerName="oc" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.141376 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="73777af0-dee3-47d4-a9d2-a48649e84e4d" containerName="oc" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.141546 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="73777af0-dee3-47d4-a9d2-a48649e84e4d" containerName="oc" Mar 20 18:40:00 
crc kubenswrapper[4795]: I0320 18:40:00.142157 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.144197 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.144321 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.145705 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.148650 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-sb6x8"] Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.245390 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.247223 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.262785 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.265271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4x96\" (UniqueName: \"kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96\") pod \"auto-csr-approver-29567200-sb6x8\" (UID: \"37b0a7d2-ac09-4b84-8083-48c33d97b032\") " pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.367417 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwhz\" (UniqueName: \"kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.367509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.367627 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.367647 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4x96\" (UniqueName: \"kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96\") pod \"auto-csr-approver-29567200-sb6x8\" (UID: \"37b0a7d2-ac09-4b84-8083-48c33d97b032\") " pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.391171 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4x96\" (UniqueName: \"kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96\") pod \"auto-csr-approver-29567200-sb6x8\" (UID: \"37b0a7d2-ac09-4b84-8083-48c33d97b032\") " pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.459671 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.470084 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwhz\" (UniqueName: \"kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.470189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.470298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.471016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.471351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.494009 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbwhz\" (UniqueName: \"kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz\") pod \"redhat-marketplace-hk5bn\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.566362 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:00 crc kubenswrapper[4795]: W0320 18:40:00.923412 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b0a7d2_ac09_4b84_8083_48c33d97b032.slice/crio-e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30 WatchSource:0}: Error finding container e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30: Status 404 returned error can't find the container with id e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30 Mar 20 18:40:00 crc kubenswrapper[4795]: I0320 18:40:00.924325 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-sb6x8"] Mar 20 18:40:01 crc kubenswrapper[4795]: W0320 18:40:01.121097 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91c84a44_9022_49e8_bc90_cf827381767d.slice/crio-be5711102151c4f8ff340c5a139a8ad16496aa629f2910ce8d0f45e90941af46 WatchSource:0}: Error finding container be5711102151c4f8ff340c5a139a8ad16496aa629f2910ce8d0f45e90941af46: Status 404 returned error can't find the container with id be5711102151c4f8ff340c5a139a8ad16496aa629f2910ce8d0f45e90941af46 Mar 20 18:40:01 crc kubenswrapper[4795]: I0320 18:40:01.127916 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:01 crc kubenswrapper[4795]: I0320 18:40:01.160717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" event={"ID":"37b0a7d2-ac09-4b84-8083-48c33d97b032","Type":"ContainerStarted","Data":"e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30"} Mar 20 18:40:01 crc kubenswrapper[4795]: I0320 18:40:01.161560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" 
event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerStarted","Data":"be5711102151c4f8ff340c5a139a8ad16496aa629f2910ce8d0f45e90941af46"} Mar 20 18:40:02 crc kubenswrapper[4795]: I0320 18:40:02.170956 4795 generic.go:334] "Generic (PLEG): container finished" podID="91c84a44-9022-49e8-bc90-cf827381767d" containerID="dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9" exitCode=0 Mar 20 18:40:02 crc kubenswrapper[4795]: I0320 18:40:02.171047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerDied","Data":"dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9"} Mar 20 18:40:03 crc kubenswrapper[4795]: I0320 18:40:03.182670 4795 generic.go:334] "Generic (PLEG): container finished" podID="37b0a7d2-ac09-4b84-8083-48c33d97b032" containerID="f6bd6f59f3702ed1b553664f1b56ecda875d8c79fa2d467d34f79e36c2a97634" exitCode=0 Mar 20 18:40:03 crc kubenswrapper[4795]: I0320 18:40:03.182742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" event={"ID":"37b0a7d2-ac09-4b84-8083-48c33d97b032","Type":"ContainerDied","Data":"f6bd6f59f3702ed1b553664f1b56ecda875d8c79fa2d467d34f79e36c2a97634"} Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.195179 4795 generic.go:334] "Generic (PLEG): container finished" podID="91c84a44-9022-49e8-bc90-cf827381767d" containerID="0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f" exitCode=0 Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.195282 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerDied","Data":"0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f"} Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.545325 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.679498 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4x96\" (UniqueName: \"kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96\") pod \"37b0a7d2-ac09-4b84-8083-48c33d97b032\" (UID: \"37b0a7d2-ac09-4b84-8083-48c33d97b032\") " Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.684780 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96" (OuterVolumeSpecName: "kube-api-access-z4x96") pod "37b0a7d2-ac09-4b84-8083-48c33d97b032" (UID: "37b0a7d2-ac09-4b84-8083-48c33d97b032"). InnerVolumeSpecName "kube-api-access-z4x96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:40:04 crc kubenswrapper[4795]: I0320 18:40:04.781925 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4x96\" (UniqueName: \"kubernetes.io/projected/37b0a7d2-ac09-4b84-8083-48c33d97b032-kube-api-access-z4x96\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.209752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" event={"ID":"37b0a7d2-ac09-4b84-8083-48c33d97b032","Type":"ContainerDied","Data":"e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30"} Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.209814 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1b70b1af941cb644eabc70b8cfead3e685e5060c5dc38d150e7588c99f5cf30" Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.209889 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567200-sb6x8" Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.218529 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerStarted","Data":"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320"} Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.240376 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hk5bn" podStartSLOduration=2.790136796 podStartE2EDuration="5.240357253s" podCreationTimestamp="2026-03-20 18:40:00 +0000 UTC" firstStartedPulling="2026-03-20 18:40:02.173233466 +0000 UTC m=+4945.631265007" lastFinishedPulling="2026-03-20 18:40:04.623453903 +0000 UTC m=+4948.081485464" observedRunningTime="2026-03-20 18:40:05.240312351 +0000 UTC m=+4948.698343922" watchObservedRunningTime="2026-03-20 18:40:05.240357253 +0000 UTC m=+4948.698388794" Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.616053 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-t7ml6"] Mar 20 18:40:05 crc kubenswrapper[4795]: I0320 18:40:05.625275 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567194-t7ml6"] Mar 20 18:40:07 crc kubenswrapper[4795]: I0320 18:40:07.266209 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce7186d-a505-4b16-ae93-2d95886d5f2d" path="/var/lib/kubelet/pods/7ce7186d-a505-4b16-ae93-2d95886d5f2d/volumes" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.567164 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.567606 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.618143 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.636710 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-5jfjl_d34761db-41bf-4e5f-bdca-8c25e281c924/nmstate-console-plugin/0.log" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.834831 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bsp49_e070281f-65f5-4c6d-b012-06c027393646/nmstate-handler/0.log" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.889369 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-xjj2s_65c42497-77ba-49bc-a292-5003a353fde6/kube-rbac-proxy/0.log" Mar 20 18:40:10 crc kubenswrapper[4795]: I0320 18:40:10.978065 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-xjj2s_65c42497-77ba-49bc-a292-5003a353fde6/nmstate-metrics/0.log" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.069532 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-dlcps_efca4120-31ef-4c52-a6da-59b33144a979/nmstate-operator/0.log" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.163923 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-mjhsq_f50011ef-d180-4d84-ba10-a2da522a579d/nmstate-webhook/0.log" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.300501 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.300550 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.300589 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.301301 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.301351 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" containerID="cri-o://55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e" gracePeriod=600 Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.309083 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:11 crc kubenswrapper[4795]: I0320 18:40:11.365149 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:12 crc kubenswrapper[4795]: I0320 18:40:12.274676 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" 
containerID="55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e" exitCode=0 Mar 20 18:40:12 crc kubenswrapper[4795]: I0320 18:40:12.274779 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e"} Mar 20 18:40:12 crc kubenswrapper[4795]: I0320 18:40:12.275784 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerStarted","Data":"4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"} Mar 20 18:40:12 crc kubenswrapper[4795]: I0320 18:40:12.275804 4795 scope.go:117] "RemoveContainer" containerID="f3798a47f257dc8f94eadec50f15f6f1b9682e3794efdf58c9c685796537b761" Mar 20 18:40:13 crc kubenswrapper[4795]: I0320 18:40:13.032005 4795 scope.go:117] "RemoveContainer" containerID="640523aab60a0eb7070cfd04917b68ca823bf4802ffb825df958dcc7af70501e" Mar 20 18:40:13 crc kubenswrapper[4795]: I0320 18:40:13.287137 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hk5bn" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="registry-server" containerID="cri-o://48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320" gracePeriod=2 Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.290793 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.297384 4795 generic.go:334] "Generic (PLEG): container finished" podID="91c84a44-9022-49e8-bc90-cf827381767d" containerID="48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320" exitCode=0 Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.297421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerDied","Data":"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320"} Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.297444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk5bn" event={"ID":"91c84a44-9022-49e8-bc90-cf827381767d","Type":"ContainerDied","Data":"be5711102151c4f8ff340c5a139a8ad16496aa629f2910ce8d0f45e90941af46"} Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.297460 4795 scope.go:117] "RemoveContainer" containerID="48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.297551 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk5bn" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.328625 4795 scope.go:117] "RemoveContainer" containerID="0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.359067 4795 scope.go:117] "RemoveContainer" containerID="dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.359454 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbwhz\" (UniqueName: \"kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz\") pod \"91c84a44-9022-49e8-bc90-cf827381767d\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.359508 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content\") pod \"91c84a44-9022-49e8-bc90-cf827381767d\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.359565 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities\") pod \"91c84a44-9022-49e8-bc90-cf827381767d\" (UID: \"91c84a44-9022-49e8-bc90-cf827381767d\") " Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.360594 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities" (OuterVolumeSpecName: "utilities") pod "91c84a44-9022-49e8-bc90-cf827381767d" (UID: "91c84a44-9022-49e8-bc90-cf827381767d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.369159 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz" (OuterVolumeSpecName: "kube-api-access-qbwhz") pod "91c84a44-9022-49e8-bc90-cf827381767d" (UID: "91c84a44-9022-49e8-bc90-cf827381767d"). InnerVolumeSpecName "kube-api-access-qbwhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.383238 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91c84a44-9022-49e8-bc90-cf827381767d" (UID: "91c84a44-9022-49e8-bc90-cf827381767d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.443906 4795 scope.go:117] "RemoveContainer" containerID="48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320" Mar 20 18:40:14 crc kubenswrapper[4795]: E0320 18:40:14.444381 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320\": container with ID starting with 48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320 not found: ID does not exist" containerID="48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.444408 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320"} err="failed to get container status \"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320\": rpc error: code = NotFound desc = could not find 
container \"48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320\": container with ID starting with 48eb5c6d0f40df592603b557aa81e8f02a16365e660f6d3f9fddb31e6d5f7320 not found: ID does not exist" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.444454 4795 scope.go:117] "RemoveContainer" containerID="0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f" Mar 20 18:40:14 crc kubenswrapper[4795]: E0320 18:40:14.444705 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f\": container with ID starting with 0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f not found: ID does not exist" containerID="0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.444725 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f"} err="failed to get container status \"0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f\": rpc error: code = NotFound desc = could not find container \"0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f\": container with ID starting with 0ededdf9ae055bf38bf156c08889d34e67b65dfbd06e3632badf12df4de9f11f not found: ID does not exist" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.444737 4795 scope.go:117] "RemoveContainer" containerID="dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9" Mar 20 18:40:14 crc kubenswrapper[4795]: E0320 18:40:14.444939 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9\": container with ID starting with dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9 not found: ID does 
not exist" containerID="dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.444958 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9"} err="failed to get container status \"dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9\": rpc error: code = NotFound desc = could not find container \"dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9\": container with ID starting with dcc47f217be71a926c24afc7f5a3f9e1bd0a1b39dcf025284576d44faf2bf9e9 not found: ID does not exist" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.461275 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbwhz\" (UniqueName: \"kubernetes.io/projected/91c84a44-9022-49e8-bc90-cf827381767d-kube-api-access-qbwhz\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.461307 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.461319 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c84a44-9022-49e8-bc90-cf827381767d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.629006 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:14 crc kubenswrapper[4795]: I0320 18:40:14.637442 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk5bn"] Mar 20 18:40:15 crc kubenswrapper[4795]: I0320 18:40:15.270339 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="91c84a44-9022-49e8-bc90-cf827381767d" path="/var/lib/kubelet/pods/91c84a44-9022-49e8-bc90-cf827381767d/volumes" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.628926 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:20 crc kubenswrapper[4795]: E0320 18:40:20.630170 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="extract-utilities" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630194 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="extract-utilities" Mar 20 18:40:20 crc kubenswrapper[4795]: E0320 18:40:20.630231 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="registry-server" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630243 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="registry-server" Mar 20 18:40:20 crc kubenswrapper[4795]: E0320 18:40:20.630271 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b0a7d2-ac09-4b84-8083-48c33d97b032" containerName="oc" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630292 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b0a7d2-ac09-4b84-8083-48c33d97b032" containerName="oc" Mar 20 18:40:20 crc kubenswrapper[4795]: E0320 18:40:20.630342 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="extract-content" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630353 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="extract-content" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630622 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="91c84a44-9022-49e8-bc90-cf827381767d" containerName="registry-server" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.630668 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b0a7d2-ac09-4b84-8083-48c33d97b032" containerName="oc" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.632867 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.679556 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.774881 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.774931 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.775636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qs9\" (UniqueName: \"kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.877357 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-85qs9\" (UniqueName: \"kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.877541 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.877571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.878136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.878154 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.905458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-85qs9\" (UniqueName: \"kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9\") pod \"certified-operators-5hcgx\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:20 crc kubenswrapper[4795]: I0320 18:40:20.979604 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:21 crc kubenswrapper[4795]: I0320 18:40:21.479132 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:22 crc kubenswrapper[4795]: I0320 18:40:22.411035 4795 generic.go:334] "Generic (PLEG): container finished" podID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerID="d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5" exitCode=0 Mar 20 18:40:22 crc kubenswrapper[4795]: I0320 18:40:22.411124 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerDied","Data":"d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5"} Mar 20 18:40:22 crc kubenswrapper[4795]: I0320 18:40:22.411477 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerStarted","Data":"11bb4e2bc49835e6a2d6bce0bd30624c0604522ef2556b61ca172297f3444edc"} Mar 20 18:40:22 crc kubenswrapper[4795]: I0320 18:40:22.414169 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:40:23 crc kubenswrapper[4795]: I0320 18:40:23.438894 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" 
event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerStarted","Data":"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2"} Mar 20 18:40:24 crc kubenswrapper[4795]: I0320 18:40:24.452425 4795 generic.go:334] "Generic (PLEG): container finished" podID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerID="4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2" exitCode=0 Mar 20 18:40:24 crc kubenswrapper[4795]: I0320 18:40:24.452493 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerDied","Data":"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2"} Mar 20 18:40:25 crc kubenswrapper[4795]: I0320 18:40:25.474912 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerStarted","Data":"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974"} Mar 20 18:40:25 crc kubenswrapper[4795]: I0320 18:40:25.500554 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5hcgx" podStartSLOduration=3.045631175 podStartE2EDuration="5.500535257s" podCreationTimestamp="2026-03-20 18:40:20 +0000 UTC" firstStartedPulling="2026-03-20 18:40:22.413942247 +0000 UTC m=+4965.871973788" lastFinishedPulling="2026-03-20 18:40:24.868846329 +0000 UTC m=+4968.326877870" observedRunningTime="2026-03-20 18:40:25.498096542 +0000 UTC m=+4968.956128083" watchObservedRunningTime="2026-03-20 18:40:25.500535257 +0000 UTC m=+4968.958566798" Mar 20 18:40:30 crc kubenswrapper[4795]: I0320 18:40:30.980818 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:30 crc kubenswrapper[4795]: I0320 18:40:30.981477 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:31 crc kubenswrapper[4795]: I0320 18:40:31.725940 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:31 crc kubenswrapper[4795]: I0320 18:40:31.775845 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:31 crc kubenswrapper[4795]: I0320 18:40:31.970653 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:33 crc kubenswrapper[4795]: I0320 18:40:33.555680 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5hcgx" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="registry-server" containerID="cri-o://3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974" gracePeriod=2 Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.099140 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.238274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities\") pod \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.238581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85qs9\" (UniqueName: \"kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9\") pod \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.238698 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content\") pod \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\" (UID: \"111e8972-3b13-46b6-b3ff-fdcfb3edd832\") " Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.241620 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities" (OuterVolumeSpecName: "utilities") pod "111e8972-3b13-46b6-b3ff-fdcfb3edd832" (UID: "111e8972-3b13-46b6-b3ff-fdcfb3edd832"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.247259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9" (OuterVolumeSpecName: "kube-api-access-85qs9") pod "111e8972-3b13-46b6-b3ff-fdcfb3edd832" (UID: "111e8972-3b13-46b6-b3ff-fdcfb3edd832"). InnerVolumeSpecName "kube-api-access-85qs9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.296221 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "111e8972-3b13-46b6-b3ff-fdcfb3edd832" (UID: "111e8972-3b13-46b6-b3ff-fdcfb3edd832"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.341155 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85qs9\" (UniqueName: \"kubernetes.io/projected/111e8972-3b13-46b6-b3ff-fdcfb3edd832-kube-api-access-85qs9\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.341187 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.341195 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111e8972-3b13-46b6-b3ff-fdcfb3edd832-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.574974 4795 generic.go:334] "Generic (PLEG): container finished" podID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerID="3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974" exitCode=0 Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.575038 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerDied","Data":"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974"} Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.575072 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-5hcgx" event={"ID":"111e8972-3b13-46b6-b3ff-fdcfb3edd832","Type":"ContainerDied","Data":"11bb4e2bc49835e6a2d6bce0bd30624c0604522ef2556b61ca172297f3444edc"} Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.575096 4795 scope.go:117] "RemoveContainer" containerID="3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.575317 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5hcgx" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.605085 4795 scope.go:117] "RemoveContainer" containerID="4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.620073 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.625797 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5hcgx"] Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.630888 4795 scope.go:117] "RemoveContainer" containerID="d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.681803 4795 scope.go:117] "RemoveContainer" containerID="3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974" Mar 20 18:40:34 crc kubenswrapper[4795]: E0320 18:40:34.682202 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974\": container with ID starting with 3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974 not found: ID does not exist" containerID="3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 
18:40:34.682235 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974"} err="failed to get container status \"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974\": rpc error: code = NotFound desc = could not find container \"3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974\": container with ID starting with 3a71d40f257fd6abc906092b013f42d7274054ff5fd7144dfaf8179ad87a5974 not found: ID does not exist" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.682255 4795 scope.go:117] "RemoveContainer" containerID="4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2" Mar 20 18:40:34 crc kubenswrapper[4795]: E0320 18:40:34.682578 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2\": container with ID starting with 4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2 not found: ID does not exist" containerID="4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.682600 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2"} err="failed to get container status \"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2\": rpc error: code = NotFound desc = could not find container \"4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2\": container with ID starting with 4e87fb7ba162a0d6c81ddad8c08beb61dede147424142d8558befb3cc374aea2 not found: ID does not exist" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.682613 4795 scope.go:117] "RemoveContainer" containerID="d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5" Mar 20 18:40:34 crc 
kubenswrapper[4795]: E0320 18:40:34.682972 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5\": container with ID starting with d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5 not found: ID does not exist" containerID="d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5" Mar 20 18:40:34 crc kubenswrapper[4795]: I0320 18:40:34.682996 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5"} err="failed to get container status \"d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5\": rpc error: code = NotFound desc = could not find container \"d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5\": container with ID starting with d4763cfc47ccc5d7ea98e5252c63ad8227c66bc7a8d9f4d43f8ce050e8618fc5 not found: ID does not exist" Mar 20 18:40:35 crc kubenswrapper[4795]: I0320 18:40:35.271796 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" path="/var/lib/kubelet/pods/111e8972-3b13-46b6-b3ff-fdcfb3edd832/volumes" Mar 20 18:40:41 crc kubenswrapper[4795]: I0320 18:40:41.387446 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kvtc5_2ce06e1f-5454-4b85-888b-3230c0086c2e/kube-rbac-proxy/0.log" Mar 20 18:40:41 crc kubenswrapper[4795]: I0320 18:40:41.562641 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kvtc5_2ce06e1f-5454-4b85-888b-3230c0086c2e/controller/0.log" Mar 20 18:40:41 crc kubenswrapper[4795]: I0320 18:40:41.939159 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: 
I0320 18:40:42.121647 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.121865 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.129690 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.153161 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.311954 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.368951 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.378547 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.403988 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.463974 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-frr-files/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.538758 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-metrics/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.566264 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/cp-reloader/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.601560 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/controller/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.721164 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/frr-metrics/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.752981 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/kube-rbac-proxy/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.804674 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/kube-rbac-proxy-frr/0.log" Mar 20 18:40:42 crc kubenswrapper[4795]: I0320 18:40:42.913187 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/reloader/0.log" Mar 20 18:40:43 crc kubenswrapper[4795]: I0320 18:40:43.075439 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jw5dk_377dbbb7-0571-40cd-9fe3-3c86fbf4f092/frr-k8s-webhook-server/0.log" Mar 20 18:40:43 crc kubenswrapper[4795]: I0320 18:40:43.162048 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7547f4d8c8-499mj_0e8dba8d-8387-4ced-ac54-b8d5e1cf3650/manager/0.log" Mar 20 18:40:43 crc kubenswrapper[4795]: I0320 18:40:43.906123 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5ffc48dc7-t9vwn_2d29ac93-da31-4834-a858-d5bd9adb28d1/webhook-server/0.log" Mar 20 18:40:44 crc kubenswrapper[4795]: I0320 18:40:44.030811 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bl9qp_8834c8fc-36f7-41da-867f-ec5a32e25b36/kube-rbac-proxy/0.log" Mar 20 18:40:44 crc kubenswrapper[4795]: I0320 18:40:44.407310 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-66lbd_a748ee28-0a26-4700-b384-3afa65b8ac9d/frr/0.log" Mar 20 18:40:44 crc kubenswrapper[4795]: I0320 18:40:44.483157 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bl9qp_8834c8fc-36f7-41da-867f-ec5a32e25b36/speaker/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.152076 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.381310 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.386940 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.422429 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.593994 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/pull/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.601478 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/util/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.607243 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljf9h_cf16b9b7-bdbf-48db-a358-3c32c93b3d43/extract/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.765027 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.941220 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.943590 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 18:40:58 crc kubenswrapper[4795]: I0320 18:40:58.972299 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.103108 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/util/0.log" Mar 20 
18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.124191 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/extract/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.125405 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c17fkjk_6d525cd0-41ce-4352-8ce0-8f24113c89d0/pull/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.268095 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.431287 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.467285 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.485933 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.648818 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-utilities/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.672365 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/extract-content/0.log" Mar 20 18:40:59 crc kubenswrapper[4795]: I0320 18:40:59.910524 
4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.108829 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.126068 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.128739 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.196147 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-965kf_58169b5e-ad5e-4928-8511-1677518e9c01/registry-server/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.280524 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-utilities/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.307740 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/extract-content/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.505410 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8x76m_a2de2777-57e1-4310-a878-1cfc1fc77e44/marketplace-operator/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.615880 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.814007 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-gtd2l_c64cefc5-9e5f-4179-9e8a-e6a6a09af8f0/registry-server/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.841982 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.851742 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:41:00 crc kubenswrapper[4795]: I0320 18:41:00.851750 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.045530 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-utilities/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.125872 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/extract-content/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.216217 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nwm6j_38a67438-04e3-433b-9b32-47acf98b3086/registry-server/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.262841 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.453741 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.514264 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.525209 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.664115 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-utilities/0.log" Mar 20 18:41:01 crc kubenswrapper[4795]: I0320 18:41:01.699722 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/extract-content/0.log" Mar 20 18:41:02 crc kubenswrapper[4795]: I0320 18:41:02.221141 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n22t9_fb801735-41d3-4c6e-b9e7-083ad510100a/registry-server/0.log" Mar 20 18:41:22 crc kubenswrapper[4795]: E0320 18:41:22.562734 4795 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.58:37962->38.102.83.58:45419: write tcp 38.102.83.58:37962->38.102.83.58:45419: write: broken pipe Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.160729 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567202-mx5d7"] Mar 20 18:42:00 crc kubenswrapper[4795]: E0320 
18:42:00.161556 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="extract-utilities" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.161569 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="extract-utilities" Mar 20 18:42:00 crc kubenswrapper[4795]: E0320 18:42:00.161589 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="registry-server" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.161595 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="registry-server" Mar 20 18:42:00 crc kubenswrapper[4795]: E0320 18:42:00.161607 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="extract-content" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.161613 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="extract-content" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.161792 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="111e8972-3b13-46b6-b3ff-fdcfb3edd832" containerName="registry-server" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.162728 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.165651 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.165942 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.165993 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.173790 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567202-mx5d7"] Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.355742 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rsxq\" (UniqueName: \"kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq\") pod \"auto-csr-approver-29567202-mx5d7\" (UID: \"08e41958-5524-4e61-8976-654c68baf648\") " pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.458708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rsxq\" (UniqueName: \"kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq\") pod \"auto-csr-approver-29567202-mx5d7\" (UID: \"08e41958-5524-4e61-8976-654c68baf648\") " pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.488356 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rsxq\" (UniqueName: \"kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq\") pod \"auto-csr-approver-29567202-mx5d7\" (UID: \"08e41958-5524-4e61-8976-654c68baf648\") " 
pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:00 crc kubenswrapper[4795]: I0320 18:42:00.784964 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:01 crc kubenswrapper[4795]: I0320 18:42:01.350122 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567202-mx5d7"] Mar 20 18:42:01 crc kubenswrapper[4795]: I0320 18:42:01.430287 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" event={"ID":"08e41958-5524-4e61-8976-654c68baf648","Type":"ContainerStarted","Data":"91d76a03da65f2314c73d85ea90c8b152c1b273fd40f50eb606305821f6612de"} Mar 20 18:42:03 crc kubenswrapper[4795]: I0320 18:42:03.450459 4795 generic.go:334] "Generic (PLEG): container finished" podID="08e41958-5524-4e61-8976-654c68baf648" containerID="a60cfe51f17f419369391cdefbf7858f781a1f44daab783e763ce1f4f4ed5586" exitCode=0 Mar 20 18:42:03 crc kubenswrapper[4795]: I0320 18:42:03.450616 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" event={"ID":"08e41958-5524-4e61-8976-654c68baf648","Type":"ContainerDied","Data":"a60cfe51f17f419369391cdefbf7858f781a1f44daab783e763ce1f4f4ed5586"} Mar 20 18:42:04 crc kubenswrapper[4795]: I0320 18:42:04.824159 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:04 crc kubenswrapper[4795]: I0320 18:42:04.955510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rsxq\" (UniqueName: \"kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq\") pod \"08e41958-5524-4e61-8976-654c68baf648\" (UID: \"08e41958-5524-4e61-8976-654c68baf648\") " Mar 20 18:42:04 crc kubenswrapper[4795]: I0320 18:42:04.971739 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq" (OuterVolumeSpecName: "kube-api-access-2rsxq") pod "08e41958-5524-4e61-8976-654c68baf648" (UID: "08e41958-5524-4e61-8976-654c68baf648"). InnerVolumeSpecName "kube-api-access-2rsxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.058558 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rsxq\" (UniqueName: \"kubernetes.io/projected/08e41958-5524-4e61-8976-654c68baf648-kube-api-access-2rsxq\") on node \"crc\" DevicePath \"\"" Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.478661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" event={"ID":"08e41958-5524-4e61-8976-654c68baf648","Type":"ContainerDied","Data":"91d76a03da65f2314c73d85ea90c8b152c1b273fd40f50eb606305821f6612de"} Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.478733 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91d76a03da65f2314c73d85ea90c8b152c1b273fd40f50eb606305821f6612de" Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.478796 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567202-mx5d7" Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.912937 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-t9x4q"] Mar 20 18:42:05 crc kubenswrapper[4795]: I0320 18:42:05.923630 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567196-t9x4q"] Mar 20 18:42:07 crc kubenswrapper[4795]: I0320 18:42:07.273259 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7203100a-018c-4662-a760-a16bd5c6322d" path="/var/lib/kubelet/pods/7203100a-018c-4662-a760-a16bd5c6322d/volumes" Mar 20 18:42:11 crc kubenswrapper[4795]: I0320 18:42:11.300553 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:42:11 crc kubenswrapper[4795]: I0320 18:42:11.301111 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:42:13 crc kubenswrapper[4795]: I0320 18:42:13.181879 4795 scope.go:117] "RemoveContainer" containerID="dd786a69be248d53a6715fb536c79a06b01be09807a6bb21bbca9e7786db827c" Mar 20 18:42:41 crc kubenswrapper[4795]: I0320 18:42:41.300142 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:42:41 crc kubenswrapper[4795]: 
I0320 18:42:41.300803 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:43:11 crc kubenswrapper[4795]: I0320 18:43:11.300582 4795 patch_prober.go:28] interesting pod/machine-config-daemon-mvxvt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:43:11 crc kubenswrapper[4795]: I0320 18:43:11.301427 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:43:11 crc kubenswrapper[4795]: I0320 18:43:11.301508 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" Mar 20 18:43:11 crc kubenswrapper[4795]: I0320 18:43:11.302811 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"} pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:43:11 crc kubenswrapper[4795]: I0320 18:43:11.302925 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" 
containerName="machine-config-daemon" containerID="cri-o://4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" gracePeriod=600 Mar 20 18:43:11 crc kubenswrapper[4795]: E0320 18:43:11.434725 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:43:12 crc kubenswrapper[4795]: I0320 18:43:12.285665 4795 generic.go:334] "Generic (PLEG): container finished" podID="8702afd1-abd3-42d0-91e6-048802e98829" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" exitCode=0 Mar 20 18:43:12 crc kubenswrapper[4795]: I0320 18:43:12.286043 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" event={"ID":"8702afd1-abd3-42d0-91e6-048802e98829","Type":"ContainerDied","Data":"4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"} Mar 20 18:43:12 crc kubenswrapper[4795]: I0320 18:43:12.286086 4795 scope.go:117] "RemoveContainer" containerID="55e3694e8c466dc1a3306501d1ffee909e405d4eec50b34950abe11c9ddb789e" Mar 20 18:43:12 crc kubenswrapper[4795]: I0320 18:43:12.287011 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:43:12 crc kubenswrapper[4795]: E0320 18:43:12.287368 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:43:24 crc kubenswrapper[4795]: I0320 18:43:24.252460 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:43:24 crc kubenswrapper[4795]: E0320 18:43:24.253384 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:43:25 crc kubenswrapper[4795]: I0320 18:43:25.464621 4795 generic.go:334] "Generic (PLEG): container finished" podID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerID="f7dc1c0fd67006d7343e6ade29b0720cb88d319027089039c39df72434041f5f" exitCode=0 Mar 20 18:43:25 crc kubenswrapper[4795]: I0320 18:43:25.464759 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" event={"ID":"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e","Type":"ContainerDied","Data":"f7dc1c0fd67006d7343e6ade29b0720cb88d319027089039c39df72434041f5f"} Mar 20 18:43:25 crc kubenswrapper[4795]: I0320 18:43:25.465620 4795 scope.go:117] "RemoveContainer" containerID="f7dc1c0fd67006d7343e6ade29b0720cb88d319027089039c39df72434041f5f" Mar 20 18:43:25 crc kubenswrapper[4795]: I0320 18:43:25.996183 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4zsvz_must-gather-k5nt7_6e72ecdf-d9fb-433e-b04d-4807b5e60c2e/gather/0.log" Mar 20 18:43:39 crc kubenswrapper[4795]: I0320 18:43:39.253594 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:43:39 crc kubenswrapper[4795]: E0320 
18:43:39.254328 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:43:39 crc kubenswrapper[4795]: I0320 18:43:39.929368 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4zsvz/must-gather-k5nt7"] Mar 20 18:43:39 crc kubenswrapper[4795]: I0320 18:43:39.929984 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="copy" containerID="cri-o://7e428e78fa28b099376d48c4437d22fc8fb058f496bc7d67f2ecb14cd1bd3b22" gracePeriod=2 Mar 20 18:43:39 crc kubenswrapper[4795]: I0320 18:43:39.940254 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4zsvz/must-gather-k5nt7"] Mar 20 18:43:40 crc kubenswrapper[4795]: I0320 18:43:40.636345 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4zsvz_must-gather-k5nt7_6e72ecdf-d9fb-433e-b04d-4807b5e60c2e/copy/0.log" Mar 20 18:43:40 crc kubenswrapper[4795]: I0320 18:43:40.637155 4795 generic.go:334] "Generic (PLEG): container finished" podID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerID="7e428e78fa28b099376d48c4437d22fc8fb058f496bc7d67f2ecb14cd1bd3b22" exitCode=143 Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.014258 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4zsvz_must-gather-k5nt7_6e72ecdf-d9fb-433e-b04d-4807b5e60c2e/copy/0.log" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.014909 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.180742 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wwcp\" (UniqueName: \"kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp\") pod \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.180884 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output\") pod \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\" (UID: \"6e72ecdf-d9fb-433e-b04d-4807b5e60c2e\") " Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.189410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp" (OuterVolumeSpecName: "kube-api-access-5wwcp") pod "6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" (UID: "6e72ecdf-d9fb-433e-b04d-4807b5e60c2e"). InnerVolumeSpecName "kube-api-access-5wwcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.283196 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wwcp\" (UniqueName: \"kubernetes.io/projected/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-kube-api-access-5wwcp\") on node \"crc\" DevicePath \"\"" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.451578 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" (UID: "6e72ecdf-d9fb-433e-b04d-4807b5e60c2e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.488406 4795 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.648753 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4zsvz_must-gather-k5nt7_6e72ecdf-d9fb-433e-b04d-4807b5e60c2e/copy/0.log" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.649278 4795 scope.go:117] "RemoveContainer" containerID="7e428e78fa28b099376d48c4437d22fc8fb058f496bc7d67f2ecb14cd1bd3b22" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.649348 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4zsvz/must-gather-k5nt7" Mar 20 18:43:41 crc kubenswrapper[4795]: I0320 18:43:41.674629 4795 scope.go:117] "RemoveContainer" containerID="f7dc1c0fd67006d7343e6ade29b0720cb88d319027089039c39df72434041f5f" Mar 20 18:43:43 crc kubenswrapper[4795]: I0320 18:43:43.270549 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" path="/var/lib/kubelet/pods/6e72ecdf-d9fb-433e-b04d-4807b5e60c2e/volumes" Mar 20 18:43:54 crc kubenswrapper[4795]: I0320 18:43:54.252887 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:43:54 crc kubenswrapper[4795]: E0320 18:43:54.253773 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" 
podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.165502 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567204-4vtjv"] Mar 20 18:44:00 crc kubenswrapper[4795]: E0320 18:44:00.166317 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e41958-5524-4e61-8976-654c68baf648" containerName="oc" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166329 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e41958-5524-4e61-8976-654c68baf648" containerName="oc" Mar 20 18:44:00 crc kubenswrapper[4795]: E0320 18:44:00.166343 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="copy" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166348 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="copy" Mar 20 18:44:00 crc kubenswrapper[4795]: E0320 18:44:00.166359 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="gather" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166365 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="gather" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166540 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="gather" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166553 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e72ecdf-d9fb-433e-b04d-4807b5e60c2e" containerName="copy" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.166566 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e41958-5524-4e61-8976-654c68baf648" containerName="oc" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.167164 4795 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.173261 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.173439 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.173467 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.183771 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567204-4vtjv"] Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.264741 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9s8w\" (UniqueName: \"kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w\") pod \"auto-csr-approver-29567204-4vtjv\" (UID: \"88a0b3d5-0037-474d-8d0f-d79e18c3acd0\") " pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.366503 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9s8w\" (UniqueName: \"kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w\") pod \"auto-csr-approver-29567204-4vtjv\" (UID: \"88a0b3d5-0037-474d-8d0f-d79e18c3acd0\") " pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.399924 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9s8w\" (UniqueName: \"kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w\") pod \"auto-csr-approver-29567204-4vtjv\" (UID: 
\"88a0b3d5-0037-474d-8d0f-d79e18c3acd0\") " pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.496204 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:00 crc kubenswrapper[4795]: I0320 18:44:00.930112 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567204-4vtjv"] Mar 20 18:44:01 crc kubenswrapper[4795]: I0320 18:44:01.892488 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" event={"ID":"88a0b3d5-0037-474d-8d0f-d79e18c3acd0","Type":"ContainerStarted","Data":"ea464badde149bfd60c15f486577ff1bcd0f13bd6337cf69ad9c7fdd6da3f530"} Mar 20 18:44:02 crc kubenswrapper[4795]: I0320 18:44:02.905280 4795 generic.go:334] "Generic (PLEG): container finished" podID="88a0b3d5-0037-474d-8d0f-d79e18c3acd0" containerID="77abde6c25e56b5397403eb84648ee3aa367f56e27e7524871903de3ff0f586b" exitCode=0 Mar 20 18:44:02 crc kubenswrapper[4795]: I0320 18:44:02.905352 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" event={"ID":"88a0b3d5-0037-474d-8d0f-d79e18c3acd0","Type":"ContainerDied","Data":"77abde6c25e56b5397403eb84648ee3aa367f56e27e7524871903de3ff0f586b"} Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.274559 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.350244 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9s8w\" (UniqueName: \"kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w\") pod \"88a0b3d5-0037-474d-8d0f-d79e18c3acd0\" (UID: \"88a0b3d5-0037-474d-8d0f-d79e18c3acd0\") " Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.356052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w" (OuterVolumeSpecName: "kube-api-access-b9s8w") pod "88a0b3d5-0037-474d-8d0f-d79e18c3acd0" (UID: "88a0b3d5-0037-474d-8d0f-d79e18c3acd0"). InnerVolumeSpecName "kube-api-access-b9s8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.454163 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9s8w\" (UniqueName: \"kubernetes.io/projected/88a0b3d5-0037-474d-8d0f-d79e18c3acd0-kube-api-access-b9s8w\") on node \"crc\" DevicePath \"\"" Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.939614 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" event={"ID":"88a0b3d5-0037-474d-8d0f-d79e18c3acd0","Type":"ContainerDied","Data":"ea464badde149bfd60c15f486577ff1bcd0f13bd6337cf69ad9c7fdd6da3f530"} Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.939661 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567204-4vtjv" Mar 20 18:44:04 crc kubenswrapper[4795]: I0320 18:44:04.939662 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea464badde149bfd60c15f486577ff1bcd0f13bd6337cf69ad9c7fdd6da3f530" Mar 20 18:44:05 crc kubenswrapper[4795]: I0320 18:44:05.345455 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-hmxc4"] Mar 20 18:44:05 crc kubenswrapper[4795]: I0320 18:44:05.360505 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567198-hmxc4"] Mar 20 18:44:07 crc kubenswrapper[4795]: I0320 18:44:07.264623 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:44:07 crc kubenswrapper[4795]: E0320 18:44:07.265342 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:44:07 crc kubenswrapper[4795]: I0320 18:44:07.266247 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73777af0-dee3-47d4-a9d2-a48649e84e4d" path="/var/lib/kubelet/pods/73777af0-dee3-47d4-a9d2-a48649e84e4d/volumes" Mar 20 18:44:13 crc kubenswrapper[4795]: I0320 18:44:13.324324 4795 scope.go:117] "RemoveContainer" containerID="6303fb5093a0bb0022b32e4ef548448b867a34002a11dd9fd46d7dd786dcfd17" Mar 20 18:44:21 crc kubenswrapper[4795]: I0320 18:44:21.252087 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:44:21 crc kubenswrapper[4795]: E0320 18:44:21.253100 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:44:36 crc kubenswrapper[4795]: I0320 18:44:36.253892 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:44:36 crc kubenswrapper[4795]: E0320 18:44:36.254647 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:44:47 crc kubenswrapper[4795]: I0320 18:44:47.265387 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:44:47 crc kubenswrapper[4795]: E0320 18:44:47.266470 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:44:59 crc kubenswrapper[4795]: I0320 18:44:59.252680 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:44:59 crc kubenswrapper[4795]: E0320 
18:44:59.253381 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.176314 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2"] Mar 20 18:45:00 crc kubenswrapper[4795]: E0320 18:45:00.177325 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a0b3d5-0037-474d-8d0f-d79e18c3acd0" containerName="oc" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.177422 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a0b3d5-0037-474d-8d0f-d79e18c3acd0" containerName="oc" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.177805 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a0b3d5-0037-474d-8d0f-d79e18c3acd0" containerName="oc" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.178670 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.183105 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.186232 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.191258 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2"] Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.245498 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.245666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.245746 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7njc\" (UniqueName: \"kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.347443 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.347505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7njc\" (UniqueName: \"kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.347591 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.349713 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.356614 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.368463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7njc\" (UniqueName: \"kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc\") pod \"collect-profiles-29567205-qr2f2\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:00 crc kubenswrapper[4795]: I0320 18:45:00.513985 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:01 crc kubenswrapper[4795]: I0320 18:45:01.007148 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2"] Mar 20 18:45:01 crc kubenswrapper[4795]: I0320 18:45:01.551795 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" event={"ID":"4478c18d-2e03-416d-b738-d06a34d5291e","Type":"ContainerStarted","Data":"381f43dd9727e0d0e774bc9e5c1fcddd9c4d5a38fc19af2de1dd7039022c1b0a"} Mar 20 18:45:01 crc kubenswrapper[4795]: I0320 18:45:01.552086 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" event={"ID":"4478c18d-2e03-416d-b738-d06a34d5291e","Type":"ContainerStarted","Data":"9d0ab81b020e431a087fdff70e65f982f5a02104ccbe929ba497588992f7a468"} Mar 20 18:45:01 crc kubenswrapper[4795]: I0320 18:45:01.573456 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" 
podStartSLOduration=1.5734310919999999 podStartE2EDuration="1.573431092s" podCreationTimestamp="2026-03-20 18:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:45:01.570094448 +0000 UTC m=+5245.028126049" watchObservedRunningTime="2026-03-20 18:45:01.573431092 +0000 UTC m=+5245.031462663" Mar 20 18:45:02 crc kubenswrapper[4795]: I0320 18:45:02.566534 4795 generic.go:334] "Generic (PLEG): container finished" podID="4478c18d-2e03-416d-b738-d06a34d5291e" containerID="381f43dd9727e0d0e774bc9e5c1fcddd9c4d5a38fc19af2de1dd7039022c1b0a" exitCode=0 Mar 20 18:45:02 crc kubenswrapper[4795]: I0320 18:45:02.566597 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" event={"ID":"4478c18d-2e03-416d-b738-d06a34d5291e","Type":"ContainerDied","Data":"381f43dd9727e0d0e774bc9e5c1fcddd9c4d5a38fc19af2de1dd7039022c1b0a"} Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.588290 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" event={"ID":"4478c18d-2e03-416d-b738-d06a34d5291e","Type":"ContainerDied","Data":"9d0ab81b020e431a087fdff70e65f982f5a02104ccbe929ba497588992f7a468"} Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.588952 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d0ab81b020e431a087fdff70e65f982f5a02104ccbe929ba497588992f7a468" Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.876664 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.957411 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume\") pod \"4478c18d-2e03-416d-b738-d06a34d5291e\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.957640 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume\") pod \"4478c18d-2e03-416d-b738-d06a34d5291e\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.957727 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7njc\" (UniqueName: \"kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc\") pod \"4478c18d-2e03-416d-b738-d06a34d5291e\" (UID: \"4478c18d-2e03-416d-b738-d06a34d5291e\") " Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.958992 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume" (OuterVolumeSpecName: "config-volume") pod "4478c18d-2e03-416d-b738-d06a34d5291e" (UID: "4478c18d-2e03-416d-b738-d06a34d5291e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.963393 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4478c18d-2e03-416d-b738-d06a34d5291e" (UID: "4478c18d-2e03-416d-b738-d06a34d5291e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:45:04 crc kubenswrapper[4795]: I0320 18:45:04.963828 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc" (OuterVolumeSpecName: "kube-api-access-h7njc") pod "4478c18d-2e03-416d-b738-d06a34d5291e" (UID: "4478c18d-2e03-416d-b738-d06a34d5291e"). InnerVolumeSpecName "kube-api-access-h7njc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.059914 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4478c18d-2e03-416d-b738-d06a34d5291e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.059946 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7njc\" (UniqueName: \"kubernetes.io/projected/4478c18d-2e03-416d-b738-d06a34d5291e-kube-api-access-h7njc\") on node \"crc\" DevicePath \"\"" Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.059958 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4478c18d-2e03-416d-b738-d06a34d5291e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.601325 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567205-qr2f2" Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.962563 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff"] Mar 20 18:45:05 crc kubenswrapper[4795]: I0320 18:45:05.972760 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-pnnff"] Mar 20 18:45:07 crc kubenswrapper[4795]: I0320 18:45:07.276412 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06aca85b-9cb4-47ae-ad12-b1cc429c542d" path="/var/lib/kubelet/pods/06aca85b-9cb4-47ae-ad12-b1cc429c542d/volumes" Mar 20 18:45:11 crc kubenswrapper[4795]: I0320 18:45:11.252480 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:45:11 crc kubenswrapper[4795]: E0320 18:45:11.253532 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:45:13 crc kubenswrapper[4795]: I0320 18:45:13.457279 4795 scope.go:117] "RemoveContainer" containerID="1dda362040903edd45488eb5dfa4174252f2f44818cf7249d6ad4da4aa90fe4e" Mar 20 18:45:25 crc kubenswrapper[4795]: I0320 18:45:25.252909 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:45:25 crc kubenswrapper[4795]: E0320 18:45:25.255912 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:45:36 crc kubenswrapper[4795]: I0320 18:45:36.943117 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-smws7"] Mar 20 18:45:36 crc kubenswrapper[4795]: E0320 18:45:36.944200 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4478c18d-2e03-416d-b738-d06a34d5291e" containerName="collect-profiles" Mar 20 18:45:36 crc kubenswrapper[4795]: I0320 18:45:36.944223 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4478c18d-2e03-416d-b738-d06a34d5291e" containerName="collect-profiles" Mar 20 18:45:36 crc kubenswrapper[4795]: I0320 18:45:36.944518 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4478c18d-2e03-416d-b738-d06a34d5291e" containerName="collect-profiles" Mar 20 18:45:36 crc kubenswrapper[4795]: I0320 18:45:36.946402 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:36 crc kubenswrapper[4795]: I0320 18:45:36.970770 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smws7"] Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.106454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.106825 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67pst\" (UniqueName: \"kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.107088 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.208792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.208903 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.208939 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67pst\" (UniqueName: \"kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.209540 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.209562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.235188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67pst\" (UniqueName: \"kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst\") pod \"community-operators-smws7\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.280345 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.781070 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smws7"] Mar 20 18:45:37 crc kubenswrapper[4795]: W0320 18:45:37.784639 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3995cb6e_a1d8_4a7b_98ae_ae112622968f.slice/crio-1bea37a6f6c90e3513c9c67e7d784990bf4234b8ca3db0d23667ca70bb108f94 WatchSource:0}: Error finding container 1bea37a6f6c90e3513c9c67e7d784990bf4234b8ca3db0d23667ca70bb108f94: Status 404 returned error can't find the container with id 1bea37a6f6c90e3513c9c67e7d784990bf4234b8ca3db0d23667ca70bb108f94 Mar 20 18:45:37 crc kubenswrapper[4795]: I0320 18:45:37.968978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerStarted","Data":"1bea37a6f6c90e3513c9c67e7d784990bf4234b8ca3db0d23667ca70bb108f94"} Mar 20 18:45:38 crc kubenswrapper[4795]: I0320 18:45:38.986124 4795 generic.go:334] "Generic (PLEG): container finished" podID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerID="9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f" exitCode=0 Mar 20 18:45:38 crc kubenswrapper[4795]: I0320 18:45:38.986181 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerDied","Data":"9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f"} Mar 20 18:45:38 crc kubenswrapper[4795]: I0320 18:45:38.988210 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:45:39 crc kubenswrapper[4795]: I0320 18:45:39.252505 4795 scope.go:117] "RemoveContainer" 
containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:45:39 crc kubenswrapper[4795]: E0320 18:45:39.252908 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:45:39 crc kubenswrapper[4795]: I0320 18:45:39.922801 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"] Mar 20 18:45:39 crc kubenswrapper[4795]: I0320 18:45:39.926307 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:39 crc kubenswrapper[4795]: I0320 18:45:39.944059 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"] Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.004370 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerStarted","Data":"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd"} Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.070060 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.070362 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.070428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz9pn\" (UniqueName: \"kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.172099 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.172180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz9pn\" (UniqueName: \"kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.172502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.172516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.174043 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.191002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz9pn\" (UniqueName: \"kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn\") pod \"redhat-operators-2g56c\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.296490 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:40 crc kubenswrapper[4795]: I0320 18:45:40.743086 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"] Mar 20 18:45:41 crc kubenswrapper[4795]: I0320 18:45:41.017168 4795 generic.go:334] "Generic (PLEG): container finished" podID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerID="75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd" exitCode=0 Mar 20 18:45:41 crc kubenswrapper[4795]: I0320 18:45:41.017234 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerDied","Data":"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd"} Mar 20 18:45:41 crc kubenswrapper[4795]: I0320 18:45:41.021462 4795 generic.go:334] "Generic (PLEG): container finished" podID="f9212b67-95e9-4502-a467-15325eab4f0f" containerID="43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b" exitCode=0 Mar 20 18:45:41 crc kubenswrapper[4795]: I0320 18:45:41.021504 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerDied","Data":"43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b"} Mar 20 18:45:41 crc kubenswrapper[4795]: I0320 18:45:41.021535 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerStarted","Data":"4af5126384c0f89ed19f023b9ec49a9c682a00720cb7c5fda5444fa7e7ef8d5c"} Mar 20 18:45:42 crc kubenswrapper[4795]: I0320 18:45:42.030999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" 
event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerStarted","Data":"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac"} Mar 20 18:45:42 crc kubenswrapper[4795]: I0320 18:45:42.034402 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerStarted","Data":"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2"} Mar 20 18:45:42 crc kubenswrapper[4795]: I0320 18:45:42.079272 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-smws7" podStartSLOduration=3.519353844 podStartE2EDuration="6.07925327s" podCreationTimestamp="2026-03-20 18:45:36 +0000 UTC" firstStartedPulling="2026-03-20 18:45:38.987978136 +0000 UTC m=+5282.446009677" lastFinishedPulling="2026-03-20 18:45:41.547877562 +0000 UTC m=+5285.005909103" observedRunningTime="2026-03-20 18:45:42.067508677 +0000 UTC m=+5285.525540268" watchObservedRunningTime="2026-03-20 18:45:42.07925327 +0000 UTC m=+5285.537285061" Mar 20 18:45:43 crc kubenswrapper[4795]: I0320 18:45:43.044112 4795 generic.go:334] "Generic (PLEG): container finished" podID="f9212b67-95e9-4502-a467-15325eab4f0f" containerID="0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac" exitCode=0 Mar 20 18:45:43 crc kubenswrapper[4795]: I0320 18:45:43.044960 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerDied","Data":"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac"} Mar 20 18:45:44 crc kubenswrapper[4795]: I0320 18:45:44.054372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" 
event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerStarted","Data":"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557"} Mar 20 18:45:44 crc kubenswrapper[4795]: I0320 18:45:44.075074 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2g56c" podStartSLOduration=2.582040582 podStartE2EDuration="5.075047105s" podCreationTimestamp="2026-03-20 18:45:39 +0000 UTC" firstStartedPulling="2026-03-20 18:45:41.022939123 +0000 UTC m=+5284.480970674" lastFinishedPulling="2026-03-20 18:45:43.515945646 +0000 UTC m=+5286.973977197" observedRunningTime="2026-03-20 18:45:44.069308457 +0000 UTC m=+5287.527340008" watchObservedRunningTime="2026-03-20 18:45:44.075047105 +0000 UTC m=+5287.533078736" Mar 20 18:45:47 crc kubenswrapper[4795]: I0320 18:45:47.280930 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:47 crc kubenswrapper[4795]: I0320 18:45:47.281407 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:47 crc kubenswrapper[4795]: I0320 18:45:47.711035 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:48 crc kubenswrapper[4795]: I0320 18:45:48.145117 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:49 crc kubenswrapper[4795]: I0320 18:45:49.112475 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smws7"] Mar 20 18:45:50 crc kubenswrapper[4795]: I0320 18:45:50.121341 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-smws7" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="registry-server" 
containerID="cri-o://485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2" gracePeriod=2 Mar 20 18:45:50 crc kubenswrapper[4795]: I0320 18:45:50.297818 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:50 crc kubenswrapper[4795]: I0320 18:45:50.297885 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.062506 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.133121 4795 generic.go:334] "Generic (PLEG): container finished" podID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerID="485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2" exitCode=0 Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.133166 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerDied","Data":"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2"} Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.133181 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smws7" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.133191 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smws7" event={"ID":"3995cb6e-a1d8-4a7b-98ae-ae112622968f","Type":"ContainerDied","Data":"1bea37a6f6c90e3513c9c67e7d784990bf4234b8ca3db0d23667ca70bb108f94"} Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.133209 4795 scope.go:117] "RemoveContainer" containerID="485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.150341 4795 scope.go:117] "RemoveContainer" containerID="75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.167826 4795 scope.go:117] "RemoveContainer" containerID="9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.192659 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities\") pod \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.192817 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content\") pod \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\" (UID: \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.192951 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67pst\" (UniqueName: \"kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst\") pod \"3995cb6e-a1d8-4a7b-98ae-ae112622968f\" (UID: 
\"3995cb6e-a1d8-4a7b-98ae-ae112622968f\") " Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.193617 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities" (OuterVolumeSpecName: "utilities") pod "3995cb6e-a1d8-4a7b-98ae-ae112622968f" (UID: "3995cb6e-a1d8-4a7b-98ae-ae112622968f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.193985 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.200077 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst" (OuterVolumeSpecName: "kube-api-access-67pst") pod "3995cb6e-a1d8-4a7b-98ae-ae112622968f" (UID: "3995cb6e-a1d8-4a7b-98ae-ae112622968f"). InnerVolumeSpecName "kube-api-access-67pst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.206150 4795 scope.go:117] "RemoveContainer" containerID="485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2" Mar 20 18:45:51 crc kubenswrapper[4795]: E0320 18:45:51.207005 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2\": container with ID starting with 485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2 not found: ID does not exist" containerID="485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.207062 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2"} err="failed to get container status \"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2\": rpc error: code = NotFound desc = could not find container \"485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2\": container with ID starting with 485294ed9c99b6b015c444e2c4045f13d063a9b7f2c58d003c64987530abc5b2 not found: ID does not exist" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.207097 4795 scope.go:117] "RemoveContainer" containerID="75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd" Mar 20 18:45:51 crc kubenswrapper[4795]: E0320 18:45:51.207478 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd\": container with ID starting with 75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd not found: ID does not exist" containerID="75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.207547 
4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd"} err="failed to get container status \"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd\": rpc error: code = NotFound desc = could not find container \"75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd\": container with ID starting with 75036611d64a0af0443e45cad8d95b32467052eac15cde6e662c5dc635f494dd not found: ID does not exist" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.207585 4795 scope.go:117] "RemoveContainer" containerID="9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f" Mar 20 18:45:51 crc kubenswrapper[4795]: E0320 18:45:51.207999 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f\": container with ID starting with 9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f not found: ID does not exist" containerID="9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.208038 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f"} err="failed to get container status \"9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f\": rpc error: code = NotFound desc = could not find container \"9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f\": container with ID starting with 9565cda3ca2b443618c8170243dc539f7cb02747e47c0f479f23506eb3de4c1f not found: ID does not exist" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.241657 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "3995cb6e-a1d8-4a7b-98ae-ae112622968f" (UID: "3995cb6e-a1d8-4a7b-98ae-ae112622968f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.296532 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3995cb6e-a1d8-4a7b-98ae-ae112622968f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.296581 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67pst\" (UniqueName: \"kubernetes.io/projected/3995cb6e-a1d8-4a7b-98ae-ae112622968f-kube-api-access-67pst\") on node \"crc\" DevicePath \"\"" Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.350726 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2g56c" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="registry-server" probeResult="failure" output=< Mar 20 18:45:51 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 20 18:45:51 crc kubenswrapper[4795]: > Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.467643 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smws7"] Mar 20 18:45:51 crc kubenswrapper[4795]: I0320 18:45:51.478141 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-smws7"] Mar 20 18:45:53 crc kubenswrapper[4795]: I0320 18:45:53.252295 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8" Mar 20 18:45:53 crc kubenswrapper[4795]: E0320 18:45:53.253114 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829" Mar 20 18:45:53 crc kubenswrapper[4795]: I0320 18:45:53.265666 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" path="/var/lib/kubelet/pods/3995cb6e-a1d8-4a7b-98ae-ae112622968f/volumes" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.166000 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567206-h4p5r"] Mar 20 18:46:00 crc kubenswrapper[4795]: E0320 18:46:00.166886 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="registry-server" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.166901 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="registry-server" Mar 20 18:46:00 crc kubenswrapper[4795]: E0320 18:46:00.166919 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="extract-content" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.166927 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="extract-content" Mar 20 18:46:00 crc kubenswrapper[4795]: E0320 18:46:00.166967 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="extract-utilities" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.166975 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="extract-utilities" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.167172 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3995cb6e-a1d8-4a7b-98ae-ae112622968f" containerName="registry-server" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.167881 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-h4p5r" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.172196 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.172654 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.172661 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.195003 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567206-h4p5r"] Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.347677 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgd9r\" (UniqueName: \"kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r\") pod \"auto-csr-approver-29567206-h4p5r\" (UID: \"9ce9f713-91c6-4873-90e4-740174e9e0d5\") " pod="openshift-infra/auto-csr-approver-29567206-h4p5r" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.348948 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.402730 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.463182 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgd9r\" (UniqueName: 
\"kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r\") pod \"auto-csr-approver-29567206-h4p5r\" (UID: \"9ce9f713-91c6-4873-90e4-740174e9e0d5\") " pod="openshift-infra/auto-csr-approver-29567206-h4p5r" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.494581 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgd9r\" (UniqueName: \"kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r\") pod \"auto-csr-approver-29567206-h4p5r\" (UID: \"9ce9f713-91c6-4873-90e4-740174e9e0d5\") " pod="openshift-infra/auto-csr-approver-29567206-h4p5r" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.503495 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-h4p5r" Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.623507 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"] Mar 20 18:46:00 crc kubenswrapper[4795]: W0320 18:46:00.989984 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ce9f713_91c6_4873_90e4_740174e9e0d5.slice/crio-caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2 WatchSource:0}: Error finding container caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2: Status 404 returned error can't find the container with id caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2 Mar 20 18:46:00 crc kubenswrapper[4795]: I0320 18:46:00.990762 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567206-h4p5r"] Mar 20 18:46:01 crc kubenswrapper[4795]: I0320 18:46:01.271780 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567206-h4p5r" 
event={"ID":"9ce9f713-91c6-4873-90e4-740174e9e0d5","Type":"ContainerStarted","Data":"caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2"} Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.267707 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2g56c" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="registry-server" containerID="cri-o://f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557" gracePeriod=2 Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.772540 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g56c" Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.905669 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content\") pod \"f9212b67-95e9-4502-a467-15325eab4f0f\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.905845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities\") pod \"f9212b67-95e9-4502-a467-15325eab4f0f\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.905909 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz9pn\" (UniqueName: \"kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn\") pod \"f9212b67-95e9-4502-a467-15325eab4f0f\" (UID: \"f9212b67-95e9-4502-a467-15325eab4f0f\") " Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.906671 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities" 
(OuterVolumeSpecName: "utilities") pod "f9212b67-95e9-4502-a467-15325eab4f0f" (UID: "f9212b67-95e9-4502-a467-15325eab4f0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:46:02 crc kubenswrapper[4795]: I0320 18:46:02.914476 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn" (OuterVolumeSpecName: "kube-api-access-tz9pn") pod "f9212b67-95e9-4502-a467-15325eab4f0f" (UID: "f9212b67-95e9-4502-a467-15325eab4f0f"). InnerVolumeSpecName "kube-api-access-tz9pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.008591 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.008639 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz9pn\" (UniqueName: \"kubernetes.io/projected/f9212b67-95e9-4502-a467-15325eab4f0f-kube-api-access-tz9pn\") on node \"crc\" DevicePath \"\"" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.076293 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9212b67-95e9-4502-a467-15325eab4f0f" (UID: "f9212b67-95e9-4502-a467-15325eab4f0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.125168 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9212b67-95e9-4502-a467-15325eab4f0f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.284353 4795 generic.go:334] "Generic (PLEG): container finished" podID="f9212b67-95e9-4502-a467-15325eab4f0f" containerID="f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557" exitCode=0 Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.284424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerDied","Data":"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557"} Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.284442 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2g56c"
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.284456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g56c" event={"ID":"f9212b67-95e9-4502-a467-15325eab4f0f","Type":"ContainerDied","Data":"4af5126384c0f89ed19f023b9ec49a9c682a00720cb7c5fda5444fa7e7ef8d5c"}
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.284479 4795 scope.go:117] "RemoveContainer" containerID="f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557"
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.287425 4795 generic.go:334] "Generic (PLEG): container finished" podID="9ce9f713-91c6-4873-90e4-740174e9e0d5" containerID="dd79648dd7c7f0b83d35341008cda3ef6a0f783a3f1f580f77e1002265d29e48" exitCode=0
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.287516 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567206-h4p5r" event={"ID":"9ce9f713-91c6-4873-90e4-740174e9e0d5","Type":"ContainerDied","Data":"dd79648dd7c7f0b83d35341008cda3ef6a0f783a3f1f580f77e1002265d29e48"}
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.319529 4795 scope.go:117] "RemoveContainer" containerID="0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac"
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.349803 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"]
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.350301 4795 scope.go:117] "RemoveContainer" containerID="43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b"
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.358292 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2g56c"]
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.406055 4795 scope.go:117] "RemoveContainer" containerID="f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557"
Mar 20 18:46:03 crc kubenswrapper[4795]: E0320 18:46:03.406481 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557\": container with ID starting with f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557 not found: ID does not exist" containerID="f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557"
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.406518 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557"} err="failed to get container status \"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557\": rpc error: code = NotFound desc = could not find container \"f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557\": container with ID starting with f08e2d2a5ddb83aeae19a43ce9f37f639f887283b5452b849cd35e7667de7557 not found: ID does not exist"
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.406551 4795 scope.go:117] "RemoveContainer" containerID="0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac"
Mar 20 18:46:03 crc kubenswrapper[4795]: E0320 18:46:03.406838 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac\": container with ID starting with 0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac not found: ID does not exist" containerID="0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac"
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.406953 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac"} err="failed to get container status \"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac\": rpc error: code = NotFound desc = could not find container \"0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac\": container with ID starting with 0ed27ef503d5e90ff28598cb39f9fbaacf82e27a6d853cde9d705cf6ee6979ac not found: ID does not exist"
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.407073 4795 scope.go:117] "RemoveContainer" containerID="43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b"
Mar 20 18:46:03 crc kubenswrapper[4795]: E0320 18:46:03.407438 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b\": container with ID starting with 43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b not found: ID does not exist" containerID="43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b"
Mar 20 18:46:03 crc kubenswrapper[4795]: I0320 18:46:03.407467 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b"} err="failed to get container status \"43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b\": rpc error: code = NotFound desc = could not find container \"43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b\": container with ID starting with 43d7a71103325de63d08ee5e10bedbf6efba091321262a0098d2fe14d153f98b not found: ID does not exist"
Mar 20 18:46:04 crc kubenswrapper[4795]: I0320 18:46:04.685739 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-h4p5r"
Mar 20 18:46:04 crc kubenswrapper[4795]: I0320 18:46:04.765217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgd9r\" (UniqueName: \"kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r\") pod \"9ce9f713-91c6-4873-90e4-740174e9e0d5\" (UID: \"9ce9f713-91c6-4873-90e4-740174e9e0d5\") "
Mar 20 18:46:04 crc kubenswrapper[4795]: I0320 18:46:04.771043 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r" (OuterVolumeSpecName: "kube-api-access-tgd9r") pod "9ce9f713-91c6-4873-90e4-740174e9e0d5" (UID: "9ce9f713-91c6-4873-90e4-740174e9e0d5"). InnerVolumeSpecName "kube-api-access-tgd9r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:46:04 crc kubenswrapper[4795]: I0320 18:46:04.867174 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgd9r\" (UniqueName: \"kubernetes.io/projected/9ce9f713-91c6-4873-90e4-740174e9e0d5-kube-api-access-tgd9r\") on node \"crc\" DevicePath \"\""
Mar 20 18:46:05 crc kubenswrapper[4795]: I0320 18:46:05.261722 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" path="/var/lib/kubelet/pods/f9212b67-95e9-4502-a467-15325eab4f0f/volumes"
Mar 20 18:46:05 crc kubenswrapper[4795]: I0320 18:46:05.310710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567206-h4p5r" event={"ID":"9ce9f713-91c6-4873-90e4-740174e9e0d5","Type":"ContainerDied","Data":"caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2"}
Mar 20 18:46:05 crc kubenswrapper[4795]: I0320 18:46:05.310752 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caf9015b8246700c1d0223b914d579eb1d45bfae9346e9481f24c5c3ef5a3ca2"
Mar 20 18:46:05 crc kubenswrapper[4795]: I0320 18:46:05.310764 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567206-h4p5r"
Mar 20 18:46:05 crc kubenswrapper[4795]: I0320 18:46:05.763344 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-sb6x8"]
Mar 20 18:46:05 crc kubenswrapper[4795]: I0320 18:46:05.773122 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567200-sb6x8"]
Mar 20 18:46:06 crc kubenswrapper[4795]: I0320 18:46:06.252234 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:46:06 crc kubenswrapper[4795]: E0320 18:46:06.252636 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:46:07 crc kubenswrapper[4795]: I0320 18:46:07.277720 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b0a7d2-ac09-4b84-8083-48c33d97b032" path="/var/lib/kubelet/pods/37b0a7d2-ac09-4b84-8083-48c33d97b032/volumes"
Mar 20 18:46:13 crc kubenswrapper[4795]: I0320 18:46:13.602502 4795 scope.go:117] "RemoveContainer" containerID="f6bd6f59f3702ed1b553664f1b56ecda875d8c79fa2d467d34f79e36c2a97634"
Mar 20 18:46:18 crc kubenswrapper[4795]: I0320 18:46:18.252703 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:46:18 crc kubenswrapper[4795]: E0320 18:46:18.253421 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:46:31 crc kubenswrapper[4795]: I0320 18:46:31.252807 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:46:31 crc kubenswrapper[4795]: E0320 18:46:31.253634 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:46:43 crc kubenswrapper[4795]: I0320 18:46:43.254292 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:46:43 crc kubenswrapper[4795]: E0320 18:46:43.255511 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:46:54 crc kubenswrapper[4795]: I0320 18:46:54.252875 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:46:54 crc kubenswrapper[4795]: E0320 18:46:54.253503 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:47:07 crc kubenswrapper[4795]: I0320 18:47:07.265100 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:47:07 crc kubenswrapper[4795]: E0320 18:47:07.265751 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:47:20 crc kubenswrapper[4795]: I0320 18:47:20.251998 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:47:20 crc kubenswrapper[4795]: E0320 18:47:20.253249 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:47:34 crc kubenswrapper[4795]: I0320 18:47:34.253500 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:47:34 crc kubenswrapper[4795]: E0320 18:47:34.254504 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:47:46 crc kubenswrapper[4795]: I0320 18:47:46.251846 4795 scope.go:117] "RemoveContainer" containerID="4b616b51435ca9ee3052fcea7ca675c8e1fa2d1f4e981c759bf1d4564a9af5a8"
Mar 20 18:47:46 crc kubenswrapper[4795]: E0320 18:47:46.252606 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mvxvt_openshift-machine-config-operator(8702afd1-abd3-42d0-91e6-048802e98829)\"" pod="openshift-machine-config-operator/machine-config-daemon-mvxvt" podUID="8702afd1-abd3-42d0-91e6-048802e98829"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.138953 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567208-2xwg5"]
Mar 20 18:48:00 crc kubenswrapper[4795]: E0320 18:48:00.140061 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce9f713-91c6-4873-90e4-740174e9e0d5" containerName="oc"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140080 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce9f713-91c6-4873-90e4-740174e9e0d5" containerName="oc"
Mar 20 18:48:00 crc kubenswrapper[4795]: E0320 18:48:00.140099 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="registry-server"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140106 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="registry-server"
Mar 20 18:48:00 crc kubenswrapper[4795]: E0320 18:48:00.140138 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="extract-utilities"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140147 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="extract-utilities"
Mar 20 18:48:00 crc kubenswrapper[4795]: E0320 18:48:00.140167 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="extract-content"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140172 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="extract-content"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140338 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9212b67-95e9-4502-a467-15325eab4f0f" containerName="registry-server"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.140353 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce9f713-91c6-4873-90e4-740174e9e0d5" containerName="oc"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.141158 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567208-2xwg5"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.143564 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.144253 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-wrx75"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.146566 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.147619 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567208-2xwg5"]
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.218609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knnc7\" (UniqueName: \"kubernetes.io/projected/99657a8e-1528-44bf-a0b2-3098aa05b8bb-kube-api-access-knnc7\") pod \"auto-csr-approver-29567208-2xwg5\" (UID: \"99657a8e-1528-44bf-a0b2-3098aa05b8bb\") " pod="openshift-infra/auto-csr-approver-29567208-2xwg5"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.321184 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knnc7\" (UniqueName: \"kubernetes.io/projected/99657a8e-1528-44bf-a0b2-3098aa05b8bb-kube-api-access-knnc7\") pod \"auto-csr-approver-29567208-2xwg5\" (UID: \"99657a8e-1528-44bf-a0b2-3098aa05b8bb\") " pod="openshift-infra/auto-csr-approver-29567208-2xwg5"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.341331 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knnc7\" (UniqueName: \"kubernetes.io/projected/99657a8e-1528-44bf-a0b2-3098aa05b8bb-kube-api-access-knnc7\") pod \"auto-csr-approver-29567208-2xwg5\" (UID: \"99657a8e-1528-44bf-a0b2-3098aa05b8bb\") " pod="openshift-infra/auto-csr-approver-29567208-2xwg5"
Mar 20 18:48:00 crc kubenswrapper[4795]: I0320 18:48:00.457184 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567208-2xwg5"